Results 1 - 20 of 88
1.
Emerg Microbes Infect ; 13(1): 2332672, 2024 Dec.
Article in English | MEDLINE | ID: mdl-38517841

ABSTRACT

Uruguay experienced its first Chikungunya virus outbreak in 2023, resulting in a significant burden to its healthcare system. We conducted an analysis based on real-time genomic surveillance (30 novel whole genomes) to offer timely insights into recent local transmission dynamics and the eco-epidemiological factors behind its emergence and spread in the country.


Subject(s)
Chikungunya virus , Chikungunya virus/genetics , Uruguay/epidemiology , Americas/epidemiology , Disease Outbreaks , Genomics
2.
JMIR Public Health Surveill ; 10: e47673, 2024 01 09.
Article in English | MEDLINE | ID: mdl-38194263

ABSTRACT

Globally, millions of lives are impacted every year by infectious disease outbreaks. Comprehensive and innovative surveillance strategies aimed at early alerting and timely containment of emerging and reemerging pathogens are a pressing priority. Shortcomings and delays in current pathogen surveillance practices further hindered the ability to inform responses, interventions, and mitigation during recent pandemics, including H1N1 influenza and SARS-CoV-2. We present the design principles of the architecture for an early-alert surveillance system that leverages the vast available data landscape, including syndromic data from primary health care, drug sales, and rumors from the lay media and social media, to identify areas with an increased number of cases of respiratory disease. In these potentially affected areas, intensive and fast sample collection, followed by advanced high-throughput genome sequencing and metagenomics-enabled pathogen characterization, would identify the circulating known or novel pathogens. Concurrently, the integration of bioclimatic and socioeconomic data, as well as transportation and mobility network data, into a data analytics platform, coupled with advanced mathematical modeling using artificial intelligence or machine learning, will enable more accurate estimation of outbreak spread risk. Such an approach aims to readily identify and characterize regions in the early stages of an outbreak, as well as model risk and patterns of spread, informing targeted mitigation and control measures. A fully operational system must integrate diverse and robust data streams to translate data into actionable intelligence and actions, ultimately paving the way toward next-generation surveillance systems.
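
A minimal sketch of the first stage of such a system, flagging regions whose syndromic counts spike above a recent baseline, might look like the following; the region names, counts, and threshold are hypothetical illustrations, not part of the described architecture.

```python
# Sketch: flag regions whose latest weekly respiratory-case count exceeds
# a recent baseline. Data and thresholds are hypothetical illustrations.
import numpy as np

def alert_regions(counts, baseline_weeks=8, z_threshold=2.5):
    """counts: dict region -> weekly case counts. Returns regions whose
    latest week is z_threshold SDs above the preceding baseline weeks."""
    flagged = []
    for region, series in counts.items():
        series = np.asarray(series, dtype=float)
        baseline = series[-(baseline_weeks + 1):-1]
        mu, sigma = baseline.mean(), baseline.std(ddof=1)
        if sigma == 0:
            continue
        z = (series[-1] - mu) / sigma
        if z > z_threshold:
            flagged.append((region, round(z, 2)))
    return flagged

# Hypothetical weekly counts: one quiet region, one with a spike.
counts = {"north": [40, 42, 38, 41, 44, 39, 43, 40, 41],
          "south": [35, 37, 36, 34, 38, 36, 35, 37, 72]}
print(alert_regions(counts))  # -> [('south', ...)]
```

In the proposed architecture, regions flagged this way would then trigger the intensive sampling and sequencing stage.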


Subject(s)
Artificial Intelligence , Influenza A Virus, H1N1 Subtype , Humans , Influenza A Virus, H1N1 Subtype/genetics , Chromosome Mapping , Data Science , Disease Outbreaks/prevention & control
3.
Pathogens ; 12(12)2023 Dec 06.
Article in English | MEDLINE | ID: mdl-38133304

ABSTRACT

Arboviruses, i.e., viruses transmitted by blood-sucking arthropods, trigger significant global epidemics. Over the past 20 years, the frequency of (re-)emergence of these pathogens, particularly those transmitted by Aedes and Culex mosquitoes, has dramatically increased. Therefore, understanding how human behavior modulates population exposure to these viruses is of particular importance. This synthesis explores the behavioral factors driving human exposure to arboviruses, focusing on household surroundings, socio-economic status, human activities, and demographic factors. Household surroundings, such as a lack of water access, greatly influence the risk of arbovirus exposure by promoting mosquito breeding in stagnant water bodies. Socio-economic factors, such as low income or low education, are correlated with an increased incidence of arboviral infections and exposure. Human activities, particularly those practiced outdoors, as well as geographical proximity to livestock rearing or crop cultivation, inadvertently provide favorable breeding environments for mosquito species, escalating the risk of virus exposure. However, the effects of demographic factors like age and gender can vary widely through space and time. While climate and environmental factors crucially impact vector development and viral replication, household surroundings, socio-economic status, human activities, and demographic factors are key drivers of arbovirus exposure. This article highlights that human behavior creates a complex interplay of factors influencing the risk of mosquito-borne virus exposure, operating at different temporal and spatial scales. To increase awareness among human populations, we must improve our understanding of these complex factors.

4.
Sci Adv ; 9(35): eadg9204, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37656782

ABSTRACT

Despite the considerable morbidity and mortality of yellow fever virus (YFV) infections in Brazil, our understanding of disease outbreaks is hampered by limited viral genomic data. Here, through a combination of phylogenetic and epidemiological models, we reconstructed the recent transmission history of YFV within different epidemic seasons in Brazil. A suitability index based on the highly domesticated Aedes aegypti was able to capture the seasonality of reported human infections. Spatial modeling revealed spatial hotspots with both past reporting and low vaccination coverage, which coincided with many of the largest urban centers in the Southeast. Phylodynamic analysis unraveled the circulation of three distinct lineages and provided proof of the directionality of a known spatial corridor that connects the endemic North with the extra-Amazonian basin. This study illustrates that genomics linked with eco-epidemiology can provide new insights into the landscape of YFV transmission, augmenting traditional approaches to infectious disease surveillance and control.


Subject(s)
Yellow Fever , Yellow fever virus , Humans , Yellow fever virus/genetics , Phylogeny , Brazil/epidemiology , Yellow Fever/epidemiology , Disease Outbreaks , Genomics
5.
medRxiv ; 2023 Sep 06.
Article in English | MEDLINE | ID: mdl-37732223

ABSTRACT

We report the first whole-genome sequences of dengue virus type 1 genotypes I and V from Uruguay, including the first cases ever reported in the country. Timely genomic analysis enabled identification of these genotypes, supporting rapid public health responses and intervention strategies to mitigate the impact of dengue outbreaks.

6.
medRxiv ; 2023 Aug 20.
Article in English | MEDLINE | ID: mdl-37646000

ABSTRACT

Uruguay experienced its first Chikungunya virus outbreak in 2023, resulting in a significant burden to its healthcare system. We conducted an analysis based on real-time genomic surveillance (30 novel whole genomes) to offer timely insights into recent local transmission dynamics and the eco-epidemiological factors behind its emergence and spread in the country.

7.
Philos Trans R Soc Lond B Biol Sci ; 378(1887): 20220282, 2023 10 09.
Article in English | MEDLINE | ID: mdl-37598709

ABSTRACT

Global access to deworming treatment is one of the public health success stories of low-income countries in the twenty-first century. Parasitic worm infections are among the most ubiquitous chronic infections of humans, and early success with mass treatment programmes for these infections was the key catalyst for the neglected tropical disease (NTD) agenda. Since the launch of the 'London Declaration' in 2012, school-based deworming programmes have become the world's largest public health interventions. WHO estimates that by 2020, some 3.3 billion school-based drug treatments had been delivered. The success of this approach was brought to a dramatic halt in April 2020 when schools were closed worldwide in response to the COVID-19 pandemic. These closures immediately excluded 1.5 billion children not only from access to education but also from all school-based health services, including deworming. WHO Pulse surveys in 2021 identified NTD treatment as among the most negatively affected health interventions worldwide, second only to mental health interventions. In reaction, governments created a global Coalition with the twin aims of reopening schools and rebuilding more resilient school-based health systems. Today, some 86 countries, comprising more than half the world's population, are delivering on this response, and coverage of some key school-based programmes now exceeds January 2020 levels. This paper explores how science, and a combination of new policy and epidemiological perspectives that began in the 1980s, led to the exceptional growth of school-based NTD programmes after 2012, and are again driving new momentum in response to the COVID-19 pandemic. This article is part of the theme issue 'Challenges and opportunities in the fight against neglected tropical diseases: a decade from the London Declaration on NTDs'.


Subject(s)
COVID-19 , Pandemics , Child , Humans , Pandemics/prevention & control , COVID-19/epidemiology , COVID-19/prevention & control , Schools , Heart Rate , London , Neglected Diseases/epidemiology , Neglected Diseases/prevention & control
8.
Article in English | MEDLINE | ID: mdl-37569036

ABSTRACT

There is evidence of the efficacy of collaborative health interventions between pharmacies and primary care providers, but little evidence of their real-world effectiveness. We aimed to assess the effectiveness, and discuss the design and challenges, of hypertension and hyperlipidemia management shared between pharmacies and primary care providers, using real-world data exchange between providers and an experimental bundled payment. This was a pragmatic, quasi-experimental controlled trial. We collected patient-level data from primary care prescription claims and electronic medical record databases, a pharmacy claims database, and patient telephone surveys at several time points. The primary outcomes were changes in blood pressure and total cholesterol. We used matched controls with difference-in-differences estimators in a generalized linear model (GLM) and a controlled interrupted time series (CITS). We collected additional data for economic and qualitative studies. A total of 6 primary care units, 20 pharmacies, and 203 patients entered the study. We were not able to observe significant differences between intervention and control. We encountered challenges that required creative strategies. This real-world trial was not able to show effectiveness, likely due to limitations in the primary care technology, which affected the sample size. It offers, however, valuable lessons on methods, strategies, and data sources, paving the way for more real-world effectiveness trials to advance value-based healthcare.
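
For illustration, a minimal sketch of the 2x2 difference-in-differences contrast underlying a GLM analysis of this kind is shown below; the blood-pressure values are hypothetical and are not the study's data.

```python
# Sketch of a classic 2x2 difference-in-differences contrast on
# hypothetical blood-pressure data (not the study's dataset).
import numpy as np

def did_estimate(pre_treat, post_treat, pre_ctrl, post_ctrl):
    """DiD: (post - pre) change in treated minus (post - pre) in controls."""
    return (np.mean(post_treat) - np.mean(pre_treat)) - \
           (np.mean(post_ctrl) - np.mean(pre_ctrl))

# Hypothetical systolic BP (mmHg) before/after the intervention.
pre_t, post_t = [152, 148, 155, 150], [140, 139, 144, 141]
pre_c, post_c = [151, 149, 153, 150], [147, 146, 149, 148]
print(f"DiD effect: {did_estimate(pre_t, post_t, pre_c, post_c):.1f} mmHg")
```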

9.
Braz J Phys Ther ; 27(3): 100504, 2023.
Article in English | MEDLINE | ID: mdl-37146510

ABSTRACT

BACKGROUND: Insufficient sleep is common and can be associated with chronic pain. OBJECTIVE: To describe the main polysomnographic findings in patients with chronic musculoskeletal pain and to estimate the associations between sleep quality, polysomnography variables, and chronic musculoskeletal pain. METHODS: This cross-sectional study analyzed a database of type 1 polysomnography exam results and then collected data from these patients via an electronic form. The form collected sociodemographic data and included clinical questionnaires measuring sleep quality, sleepiness, pain intensity, and signs of central sensitization. Pearson's correlation coefficient and odds ratios were used to estimate the associations. RESULTS: The mean age of the respondents was 55.1 (SD 13.4) years. The mean score on the Central Sensitization Inventory (CSI) indicated signs of central sensitization (50.1; SD 13.4) among the participants. Most patients (86%) had 1 or more nocturnal awakenings, 90% had one or more episodes of sleep apnea, 47% had a rapid eye movement (REM) sleep phase latency greater than 70-120 min, and the mean sleep efficiency among all participants was 81.6%. The Pittsburgh Sleep Quality Index score was correlated with the CSI score (r = 0.55; 95% CI: 0.45, 0.61). People with signs of central sensitization were 2.6 times more likely to present sleep episodes with blood oxygen saturation below 90% (OR = 2.62; 95% CI: 1.23, 6.47). CONCLUSION: Most people with signs of central sensitization had poor sleep quality, night waking episodes, and specific disturbances in sleep phases. The findings showed associations between central sensitization, sleep quality, nocturnal awakening, and changes in blood oxygen saturation during sleep.
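
As an illustration of the two association measures used here, the sketch below computes a Pearson correlation and an odds ratio with a 95% CI from a 2x2 table; all numbers are hypothetical, not the study data.

```python
# Sketch: Pearson's r between two questionnaire scores, and an odds
# ratio with a 95% CI from a 2x2 table. Numbers are hypothetical.
import math
import numpy as np
from scipy import stats

# Hypothetical PSQI and CSI scores for 8 patients.
psqi = np.array([12, 9, 14, 7, 11, 13, 6, 10])
csi = np.array([55, 48, 62, 40, 52, 60, 38, 50])
r, p = stats.pearsonr(psqi, csi)
print(f"Pearson r = {r:.2f} (p = {p:.3f})")

# Hypothetical 2x2 table: rows = central sensitization yes/no,
# columns = SpO2 < 90% episodes yes/no.
a, b, c, d = 20, 18, 8, 19
or_ = (a * d) / (b * c)
se = math.sqrt(1/a + 1/b + 1/c + 1/d)          # log-OR standard error
lo, hi = or_ * math.exp(-1.96 * se), or_ * math.exp(1.96 * se)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```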


Subject(s)
Chronic Pain , Musculoskeletal Pain , Sleep Initiation and Maintenance Disorders , Humans , Middle Aged , Chronic Pain/diagnosis , Central Nervous System Sensitization , Sleep Quality , Cross-Sectional Studies
10.
Cell Host Microbe ; 31(6): 861-873, 2023 06 14.
Article in English | MEDLINE | ID: mdl-36921604

ABSTRACT

The COVID-19 pandemic galvanized the field of virus genomic surveillance, demonstrating its utility for public health. Now, we must harness the momentum that led to increased infrastructure, training, and political will to build a sustainable global genomic surveillance network for other epidemic and endemic viruses. We suggest a generalizable modular sequencing framework wherein users can easily switch between virus targets to maximize cost-effectiveness and maintain readiness for new threats. We also highlight challenges associated with genomic surveillance, including areas where global inequalities persist. We propose solutions to mitigate some of these issues, including training and multilateral partnerships. Exploring alternatives to clinical sequencing can also reduce the cost of surveillance programs. Finally, we discuss how establishing genomic surveillance would aid control programs and potentially provide a warning system for outbreaks, using a global respiratory virus (RSV), an arbovirus (dengue virus), and a regional zoonotic virus (Lassa virus) as examples.


Subject(s)
COVID-19 , Viruses , Humans , Pandemics , Disease Outbreaks , Public Health
11.
Epidemiol Infect ; 151: e5, 2022 12 16.
Article in English | MEDLINE | ID: mdl-36524247

ABSTRACT

Quantitative information on epidemiological quantities such as the incubation period and generation time of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) variants is scarce. We analysed a dataset collected during contact tracing activities in the province of Reggio Emilia, Italy, throughout 2021. We determined the distributions of the incubation period for the Alpha and Delta variants using information on negative polymerase chain reaction tests and the date of last exposure from 282 symptomatic cases. We estimated the distributions of the intrinsic generation time using a Bayesian inference approach applied to 9724 SARS-CoV-2 cases clustered in 3545 households where at least one secondary case was recorded. We estimated a mean incubation period of 4.9 days (95% credible intervals, CrI, 4.4-5.4) for Alpha and 4.5 days (95% CrI 4.0-5.0) for Delta. The intrinsic generation time was estimated to have a mean of 7.12 days (95% CrI 6.27-8.44) for Alpha and of 6.52 days (95% CrI 5.54-8.43) for Delta. The household serial interval was 2.43 days (95% CrI 2.29-2.58) for Alpha and 2.74 days (95% CrI 2.62-2.88) for Delta, and the estimated proportion of pre-symptomatic transmission was 48-51% for both variants. These results indicate limited differences in the incubation period and intrinsic generation time of SARS-CoV-2 variants Alpha and Delta compared to ancestral lineages.
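
The study's Bayesian estimation on censored exposure windows is more involved, but the core idea of summarizing an incubation-period distribution can be sketched with a simple maximum-likelihood gamma fit to hypothetical onset delays:

```python
# Sketch: fit a gamma distribution to hypothetical symptom-onset delays.
# The paper uses Bayesian inference on censored exposure windows; this
# MLE version on synthetic point data only illustrates the idea.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical onset delays (days), roughly matching a ~4.9-day mean.
delays = rng.gamma(shape=4.0, scale=1.22, size=282)

shape, loc, scale = stats.gamma.fit(delays, floc=0)  # fix location at 0
print(f"fitted gamma: shape={shape:.2f}, scale={scale:.2f}, "
      f"mean={shape * scale:.2f} d")
print("95% of incubation periods below",
      f"{stats.gamma.ppf(0.95, shape, scale=scale):.1f} d")
```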


Subject(s)
COVID-19 , SARS-CoV-2 , Humans , SARS-CoV-2/genetics , COVID-19/epidemiology , Contact Tracing , Bayes Theorem , Infectious Disease Incubation Period
12.
Sensors (Basel) ; 22(23)2022 Nov 29.
Article in English | MEDLINE | ID: mdl-36501978

ABSTRACT

Pain is a complex phenomenon that arises from the interaction of multiple neuroanatomic and neurochemical systems with several cognitive and affective processes. The assessment of pain intensity still relies on self-reports. However, recent research has shown a connection between the perception of pain and an exacerbated stress response in the autonomic nervous system. As a result, there has been growing interest in using autonomic reactivity to assess pain. In the present study, the methods included pre-processing, feature extraction, and feature analysis. To understand and characterize the physiological responses to pain, different physiological signals were recorded simultaneously while a pain-inducing protocol was performed. The results for the electrocardiogram (ECG) showed a statistically significant increase in heart rate during the painful period compared with non-painful periods. Additionally, heart rate variability features demonstrated a decrease in parasympathetic nervous system influence. Features from the electromyogram (EMG) showed an increase in the power and contraction force of the muscle during the pain-induction task. Lastly, the electrodermal activity (EDA) showed an adjustment of sudomotor activity, implying an increase in sympathetic nervous system activity during the experience of pain.
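
A minimal sketch of the kind of time-domain heart rate variability features referenced above (mean HR, SDNN, RMSSD), computed from synthetic RR intervals, follows; real pipelines would first detect R-peaks in the raw ECG.

```python
# Sketch: time-domain HRV features from RR intervals. The RR series is
# synthetic; a real pipeline would derive it from R-peak detection.
import numpy as np

def hrv_features(rr_ms):
    rr = np.asarray(rr_ms, dtype=float)
    hr = 60000.0 / rr.mean()                     # mean heart rate, bpm
    sdnn = rr.std(ddof=1)                        # overall variability
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))   # beat-to-beat (vagal) tone
    return hr, sdnn, rmssd

# Hypothetical RR intervals (ms): shorter and steadier during pain,
# consistent with higher HR and reduced parasympathetic influence.
baseline = [820, 805, 840, 815, 830, 810, 825]
pain = [700, 695, 705, 698, 702, 699, 701]
for label, rr in [("baseline", baseline), ("pain", pain)]:
    hr, sdnn, rmssd = hrv_features(rr)
    print(f"{label}: HR={hr:.0f} bpm, SDNN={sdnn:.1f} ms, RMSSD={rmssd:.1f} ms")
```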


Subject(s)
Autonomic Nervous System , Sympathetic Nervous System , Humans , Autonomic Nervous System/physiology , Heart Rate/physiology , Sympathetic Nervous System/physiology , Pain , Electrocardiography
13.
World J Gastrointest Surg ; 14(11): 1297-1309, 2022 Nov 27.
Article in English | MEDLINE | ID: mdl-36504511

ABSTRACT

BACKGROUND: Colorectal anastomotic leakage (CAL), a severe postoperative complication, is associated with high morbidity, hospital readmission, and overall healthcare costs. Early detection of CAL remains a challenge in clinical practice. However, some decision models have been developed to increase the diagnostic accuracy of this event. AIM: To develop a score based on easily accessible variables to detect CAL early. METHODS: Based on the least absolute shrinkage and selection operator (LASSO) method, a predictive classification system [the Early ColoRectAL Leakage (E-CRALL) score] was developed from a prospective observational, single-center cohort in the colorectal division of a non-academic hospital. The score performance and CAL threshold from postoperative day (POD) 3 to POD5 were estimated. Using a decision-analytic model, standard clinical practice was compared with E-CRALL adoption on POD3, POD4, or POD5. A cost-minimization analysis was conducted, on the assumption that all alternatives delivered similar health-related effects. RESULTS: In this study, 396 patients underwent colorectal resection surgery with anastomosis, and 6.3% (n = 25) developed CAL. Most of the patients who developed CAL (n = 23; 92%) were diagnosed during the first hospital admission, with a mean time to diagnosis of 9.0 ± 6.8 d. From POD3 to POD5, the area under the receiver operating characteristic curve of the E-CRALL score was 0.82, 0.84, and 0.95, respectively. On POD5, if a threshold of 8.29 was chosen, 87.4% of anastomotic failures were identified with E-CRALL adoption. Additionally, score usage could anticipate CAL diagnosis by an average of 5.2 d and 4.1 d if used on POD3 and POD5, respectively. Regardless of score adoption, comprehensive episode costs were markedly greater (up to four times) in patients who developed CAL than in patients who did not. Nonetheless, use of the E-CRALL warning score was associated with cost savings of €421,442.20, with most (92.9%) of the savings coming from patients who did not develop CAL. CONCLUSION: The E-CRALL score is an accessible tool for predicting CAL at an early timepoint. Additionally, E-CRALL can reduce overall healthcare costs, mainly through a reduction in hospital costs, regardless of whether a patient develops CAL.
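
A minimal sketch of a LASSO-style risk score of this kind, using L1-penalized logistic regression on synthetic data, is shown below; the published score's variables, weights, and the 8.29 threshold are not reproduced here.

```python
# Sketch: L1-penalized (LASSO-style) logistic regression as a risk score,
# on synthetic data. Not the published E-CRALL variables or weights.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 396
X = rng.normal(size=(n, 5))                    # e.g., POD vitals and labs
logit = 0.9 * X[:, 0] + 1.4 * X[:, 1] - 3.0    # two informative features
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # low event rate, as with CAL

# The L1 penalty performs the LASSO-style variable selection.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
model.fit(X, y)
risk = model.predict_proba(X)[:, 1]
print("selected coefficients:", np.round(model.coef_, 2))
print(f"apparent AUROC: {roc_auc_score(y, risk):.2f}")
```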

15.
World J Gastroenterol ; 28(24): 2758-2774, 2022 Jun 28.
Article in English | MEDLINE | ID: mdl-35979163

ABSTRACT

BACKGROUND: Colorectal anastomotic leakage (CAL) is one of the most dreaded complications after colorectal surgery, with an incidence that can be as high as 27%. This event is associated with increased morbidity and mortality; therefore, its early diagnosis is crucial to reduce clinical consequences and costs. Some biomarkers have been suggested as laboratory tools for the diagnosis of CAL. AIM: To assess the usefulness of plasma C-reactive protein (CRP) and calprotectin (CLP) as early predictors of CAL. METHODS: A prospective monocentric observational study was conducted including patients who underwent colorectal resection with anastomosis, from March 2017 to August 2019. Patients were divided into three groups: G1 - no complications; G2 - complications not related to CAL; and G3 - CAL. Five biomarkers were measured and analyzed over the first 5 postoperative days (PODs): white blood cell (WBC) count, eosinophil cell count (ECC), CRP, CLP, and procalcitonin (PCT). Clinical criteria, such as abdominal pain and clinical condition, were also assessed. The correlation between biomarkers and CAL was evaluated. Receiver operating characteristic (ROC) curve analysis was used to compare the accuracy of these biomarkers as predictors of CAL, and the area under the ROC curve (AUROC), specificity, sensitivity, positive predictive value, and negative predictive value (NPV) during this period were estimated. RESULTS: In total, 25 of 396 patients developed CAL (6.3%), and the mean time to this diagnosis was 9.0 ± 6.8 d. Some operative characteristics, such as surgical approach, blood loss, intraoperative complications, and duration of the procedure, were notably related to the development of CAL. The length of hospital stay was markedly higher in the group that developed CAL than in the group with complications other than CAL and the group with no complications (median of 21 d vs 13 d and 7 d, respectively; P < 0.001). For abdominal pain, the best predictive performance was on POD4 and POD5, with the largest AUROC of 0.84 on POD4. Worsening of the clinical condition was associated with the diagnosis of CAL, presenting a higher predictive effect on POD5, with an AUROC of 0.9. WBC count and ECC showed better predictive effects on POD5 (AUROC = 0.62 and 0.7, respectively). These markers also presented a high NPV (94%-98%). PCT had the best predictive effect on POD5 (AUROC = 0.61), although it presented low accuracy. However, this biomarker revealed a high NPV on POD3, POD4, and POD5 (96%, 95%, and 96%, respectively). The mean CRP value on POD5 was significantly higher in the group that developed CAL than in the group without complications (195.5 ± 139.9 mg/L vs 59.5 ± 43.4 mg/L; P < 0.00001). On POD5, CRP had an NPV of 98%. The mean CLP value on POD3 was significantly higher in G3 than in G1 (11.52 ± 6.81 µg/mL vs 5.26 ± 3.58 µg/mL; P < 0.00005). On POD3, the combination of CLP and CRP values showed a high diagnostic accuracy (AUROC = 0.82), providing a 5.2 d reduction in the time to CAL diagnosis. CONCLUSION: CRP and CLP are moderate predictors of CAL. However, the combination of these biomarkers presents an increased diagnostic accuracy, potentially decreasing the time to CAL diagnosis.
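
For illustration, the sketch below computes sensitivity, specificity, and NPV of a single biomarker at a fixed cutoff; the CRP values and the 100 mg/L cutoff are hypothetical, not the study's.

```python
# Sketch: sensitivity, specificity, and NPV of a biomarker at a cutoff.
# The CRP values and 100 mg/L cutoff are hypothetical illustrations.
import numpy as np

def classify_metrics(values, labels, cutoff):
    values, labels = np.asarray(values), np.asarray(labels, dtype=bool)
    pred = values >= cutoff
    tp = np.sum(pred & labels)
    fn = np.sum(~pred & labels)
    tn = np.sum(~pred & ~labels)
    fp = np.sum(pred & ~labels)
    return tp / (tp + fn), tn / (tn + fp), tn / (tn + fn)

crp = [210, 180, 95, 60, 55, 220, 70, 40, 90, 65]   # mg/L on POD5
cal = [1, 1, 0, 0, 0, 1, 0, 0, 1, 0]                # 1 = anastomotic leak
sens, spec, npv = classify_metrics(crp, cal, cutoff=100)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} NPV={npv:.2f}")
```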


Subject(s)
Anastomotic Leak , Colorectal Neoplasms , Abdominal Pain/complications , Anastomotic Leak/diagnosis , Anastomotic Leak/etiology , Biomarkers , C-Reactive Protein/analysis , Colorectal Neoplasms/complications , Colorectal Neoplasms/diagnosis , Colorectal Neoplasms/surgery , Early Diagnosis , Humans , Leukocyte L1 Antigen Complex , Prospective Studies , ROC Curve
16.
PLoS One ; 17(8): e0272820, 2022.
Article in English | MEDLINE | ID: mdl-36037207

ABSTRACT

School and college reopening-closure policies are considered one of the most promising non-pharmaceutical interventions for mitigating infectious diseases. Nonetheless, the effectiveness of these policies is still debated, largely due to the lack of empirical evidence on behavior during implementation. We examined U.S. college reopenings' association with changes in human mobility within campuses and in COVID-19 incidence in the counties of the campuses over a twenty-week period around college reopenings in the Fall of 2020. We used an integrative framework, with a difference-in-differences design comparing areas with a college campus, before and after reopening, to areas without a campus and a Bayesian approach to estimate the daily reproductive number (Rt). We found that college reopenings were associated with increased campus mobility, and increased COVID-19 incidence by 4.9 cases per 100,000 (95% confidence interval [CI]: 2.9-6.9), or a 37% increase relative to the pre-period mean. This reflected our estimate of increased transmission locally after reopening. A greater increase in county COVID-19 incidence resulted from campuses that drew students from counties with high COVID-19 incidence in the weeks before reopening (χ2(2) = 8.9, p = 0.012) and those with a greater share of college students, relative to population (χ2(2) = 98.83, p < 0.001). Even by Fall of 2022, large shares of populations remained unvaccinated, increasing the relevance of understanding non-pharmaceutical decisions over an extended period of a pandemic. Our study sheds light on movement and social mixing patterns during the closure-reopening of colleges during a public health threat, and offers strategic instruments for benefit-cost analyses of school reopening/closure policies.
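
The Bayesian daily Rt estimation referenced above builds on the renewal equation; a simplified (non-Bayesian) version of that logic can be sketched as follows, with hypothetical case counts and serial-interval weights.

```python
# Sketch of renewal-equation Rt: incident cases divided by the infection
# potential of recent cases weighted by the serial interval. The case
# counts and serial-interval weights are hypothetical.
import numpy as np

def rt_series(incidence, si_weights):
    inc = np.asarray(incidence, dtype=float)
    w = np.asarray(si_weights, dtype=float)
    w = w / w.sum()
    rts = []
    for t in range(len(w), len(inc)):
        # Weight the most recent cases by the serial-interval pmf.
        potential = np.dot(inc[t - len(w):t][::-1], w)
        rts.append(inc[t] / potential if potential > 0 else np.nan)
    return np.array(rts)

si = [0.05, 0.2, 0.3, 0.25, 0.15, 0.05]   # hypothetical serial interval pmf
cases = [5, 7, 9, 12, 16, 22, 30, 41, 55, 74]
print(np.round(rt_series(cases, si), 2))  # > 1 throughout this growing series
```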


Subject(s)
COVID-19 , Bayes Theorem , COVID-19/epidemiology , Humans , Incidence , Pandemics/prevention & control , United States/epidemiology , Universities
17.
Nat Commun ; 13(1): 4910, 2022 08 20.
Article in English | MEDLINE | ID: mdl-35987759

ABSTRACT

Appropriate isolation guidelines for COVID-19 patients are warranted. Currently, isolation for a fixed duration is adopted in most countries. However, given the variability in viral dynamics between patients, some patients may no longer be infectious by the end of isolation, whereas others may still be infectious. Utilizing viral test results to determine isolation length would minimize both the risk of prematurely ending isolation of infectious patients and the unnecessary individual burden of redundant isolation of noninfectious patients. In this study, we develop a data-driven computational framework to compute the population-level risk and the burden of different isolation guidelines with rapid antigen tests (i.e., lateral flow tests). Here, we show that when the detection limit is higher than the infectiousness threshold values, additional consecutive negative results are needed to ascertain infectiousness status. Further, rapid antigen tests should be designed to have lower detection limits than infectiousness threshold values to minimize the length of prolonged isolation.
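
A toy version of the framework's core comparison, ending isolation after consecutive negative antigen tests on a simulated viral-load decline, illustrates why a detection limit above the infectiousness threshold is problematic; all parameters below are hypothetical.

```python
# Toy sketch: end isolation after consecutive negative daily antigen
# tests, on a simple declining log10 viral load. All parameters are
# hypothetical; the paper fits realistic within-host dynamics.
INFECTIOUS_THRESHOLD = 5.0   # log10 copies/mL; infectious above this
DETECTION_LIMIT = 6.0        # test turns negative below this load

def viral_load(day, peak=8.0, decay=0.5):
    """Declining log10 viral load, `decay` log10 units per day."""
    return peak - decay * day

def test_based_end(decay, consecutive_neg=2):
    """First day isolation ends after `consecutive_neg` daily negatives."""
    negatives, day = 0, 0
    while negatives < consecutive_neg:
        negatives = negatives + 1 if viral_load(day, decay=decay) < DETECTION_LIMIT else 0
        day += 1
    return day

for decay in (0.3, 0.5, 0.8):    # slow, typical, and fast clearers
    end = test_based_end(decay)
    infectious = viral_load(end, decay=decay) > INFECTIOUS_THRESHOLD
    print(f"decay={decay}: released on day {end}, still infectious: {infectious}")
```

Because the detection limit here sits above the infectiousness threshold, slow clearers can test negative while still infectious, which is exactly the case where the paper shows extra consecutive negatives are required.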


Subject(s)
COVID-19 , COVID-19/diagnosis , Humans , SARS-CoV-2
18.
BMC Infect Dis ; 22(1): 656, 2022 Jul 28.
Article in English | MEDLINE | ID: mdl-35902832

ABSTRACT

BACKGROUND: Multiple waves of the COVID-19 epidemic had hit most countries by the end of 2021. Most of those waves were caused by the emergence and importation of new variants. To prevent the importation of new variants, a combination of border control and contact tracing is essential. However, the timing of infection inferred by interview is subject to recall bias, which hinders the contact tracing process. METHODS: We propose a novel approach to infer the timing of infection by employing a within-host model to capture viral load dynamics after the onset of symptoms. We applied this approach to ascertain secondary transmission, which can trigger outbreaks. As a demonstration, the 12 initial reported cases in Singapore, which were considered imported because of their recent travel history to Wuhan, were analyzed to assess whether they were truly imported. RESULTS: Our approach suggested that 6 cases were infected prior to arrival in Singapore, whereas the other 6 cases might have been secondary local infections. Three of the 6 potential secondary transmission cases had a contact history with previously confirmed cases. CONCLUSIONS: Contact tracing combined with our approach using viral load data could be key to mitigating the risk of importation of new variants by identifying cases as early as possible and inferring the timing of infection with high accuracy.
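
A toy illustration of the inference idea follows: shift a fixed within-host load curve in time to best fit observed measurements. The curve and data are hypothetical, and the paper fits a proper within-host dynamics model rather than this grid search.

```python
# Toy sketch: infer infection timing by shifting a fixed viral-load
# curve to best fit observed loads. Curve and data are hypothetical.
import numpy as np

def model_load(days_since_infection):
    """Piecewise log10 load: linear growth to a peak at day 5, then decay."""
    t = np.asarray(days_since_infection, dtype=float)
    return np.where(t < 5, 1.5 * t, 7.5 - 0.6 * (t - 5)).clip(min=0)

# Hypothetical loads measured 0, 1, 2 days after symptom onset.
obs_days = np.array([0.0, 1.0, 2.0])
obs_load = np.array([7.1, 6.6, 5.9])

# Grid search over how long before onset the infection occurred.
candidates = np.arange(2, 12, 0.25)
sse = [np.sum((model_load(obs_days + d) - obs_load) ** 2) for d in candidates]
best = candidates[int(np.argmin(sse))]
print(f"estimated infection ~{best:.2f} days before symptom onset")
```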


Subject(s)
COVID-19 , SARS-CoV-2 , Contact Tracing , Humans , Travel , Viral Load
19.
Nat Commun ; 13(1): 3319, 2022 06 09.
Article in English | MEDLINE | ID: mdl-35680843

ABSTRACT

Public policy and academic debates regarding pandemic control strategies note disease-economy trade-offs, often prioritizing one outcome over the other. Using a calibrated, coupled epi-economic model of individual behavior embedded within the broader economy during a novel epidemic, we show that targeted isolation strategies can avert up to 91% of economic losses relative to voluntary isolation strategies. Unlike widely used blanket lockdowns, the economic savings of targeted isolation do not impose additional disease burdens, avoiding disease-economy trade-offs. Targeted isolation achieves this by addressing the fundamental coordination failure between infectious and susceptible individuals that drives the recession. Importantly, we show that testing and compliance frictions can erode some of the gains from targeted isolation, but improving test quality unlocks the majority of the benefits of targeted isolation.
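
A toy epi-economic contrast in the paper's spirit compares blanket activity reduction with isolation targeted at infectious individuals in a discrete SIR model; the parameters and loss index below are hypothetical, not the calibrated model.

```python
# Toy sketch: blanket activity reduction vs. targeted isolation of
# infectious individuals in a discrete-time SIR model. Parameters and
# the loss index are hypothetical illustrations.
def run_sir(beta, gamma=0.2, days=200, i0=1e-4, iso_infectious=0.0, activity=1.0):
    s, i, r, losses = 1 - i0, i0, 0.0, 0.0
    for _ in range(days):
        # Blanket measures scale everyone's contacts; targeted isolation
        # removes a share of infectious individuals from transmission.
        eff_beta = beta * activity**2 * (1 - iso_infectious)
        new_inf = eff_beta * s * i
        s, i, r = s - new_inf, i + new_inf - gamma * i, r + gamma * i
        # Loss proxy: activity forgone by all + output lost by isolated sick.
        losses += (1 - activity) + iso_infectious * i
    return r, losses

epi_blanket, loss_blanket = run_sir(beta=0.5, activity=0.7)
epi_target, loss_target = run_sir(beta=0.5, iso_infectious=0.6)
print(f"blanket:  attack rate={epi_blanket:.2f}, loss index={loss_blanket:.1f}")
print(f"targeted: attack rate={epi_target:.2f}, loss index={loss_target:.1f}")
```

In this toy run, targeting only the infectious achieves a comparable or lower attack rate at a far smaller accumulated loss, mirroring the coordination argument in the abstract.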


Subject(s)
Pandemics , Public Policy , Humans , Income , Pandemics/prevention & control