Results 1 - 20 of 36
1.
BMC Health Serv Res ; 24(1): 556, 2024 Apr 30.
Article in English | MEDLINE | ID: mdl-38693557

ABSTRACT

OBJECTIVE: Long waiting times for elective hospital treatments are common in many countries. This study seeks to address a deficit in the literature concerning the effect of long waits on the wider consumption of healthcare resources. METHODS: We carried out a retrospective treatment-control study in a healthcare system in South West England from 15 June 2021 to 15 December 2021. We compared weekly contacts with health services of patients waiting over 18 weeks for treatment ('Treatments') and people not on a waiting list ('Controls'). Controls were matched to Treatments based on age, sex, deprivation and multimorbidity. Treatments were stratified by the clinical specialty of the awaited hospital treatment, with healthcare usage assessed over various healthcare settings. Wilcoxon signed-rank tests assessed whether there was an increase in healthcare utilisation and bootstrap resampling was used to estimate the magnitude of any differences. RESULTS: A total of 44,616 patients were waiting over 18 weeks (the constitutional target in England) for treatment during the study period. There was an increase (p < 0.0004) in healthcare utilisation for all specialties. Patients in the Cardiothoracic Surgery specialty had the largest increase, with 17.9 [interquartile-range: 4.3, 33.8] additional contacts with secondary care and 17.3 [-1.1, 34.1] additional prescriptions per year. CONCLUSION: People waiting for treatment consume higher levels of healthcare than comparable individuals not on a waiting list. These findings are relevant for clinicians and managers in better understanding patient need and reducing harm. Results also highlight the possible 'false economy' in failing to promptly resolve long elective waits.
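To illustrate the analytical approach described in the METHODS (a Wilcoxon signed-rank test on matched pairs, with bootstrap resampling to estimate the magnitude of the difference), a minimal Python sketch follows. The data, group sizes and rates are synthetic assumptions, not taken from the study.

```python
# Minimal sketch: Wilcoxon signed-rank test on paired weekly contacts,
# plus a bootstrap confidence interval for the median Treatment-minus-
# Control difference. All values are synthetic.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)

# Illustrative paired data: weekly healthcare contacts per matched pair
treatment = rng.poisson(3.0, size=500)   # patients waiting >18 weeks
control = rng.poisson(2.5, size=500)     # matched non-waiting individuals

# One-sided test: is utilisation higher in the waiting group?
stat, p_value = wilcoxon(treatment, control, alternative="greater")

# Bootstrap the median paired difference to estimate effect magnitude
diffs = treatment - control
boot_medians = [np.median(rng.choice(diffs, size=diffs.size, replace=True))
                for _ in range(10_000)]
ci_low, ci_high = np.percentile(boot_medians, [2.5, 97.5])

print(f"p = {p_value:.4g}, median difference = {np.median(diffs):.2f} "
      f"(95% bootstrap CI {ci_low:.2f} to {ci_high:.2f})")
```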


Subject(s)
Elective Surgical Procedures , Patient Acceptance of Health Care , Waiting Lists , Humans , Retrospective Studies , Male , Female , Middle Aged , Elective Surgical Procedures/statistics & numerical data , Elective Surgical Procedures/economics , Aged , Patient Acceptance of Health Care/statistics & numerical data , England , Adult , Case-Control Studies , United Kingdom
2.
Int J Ment Health Syst ; 18(1): 12, 2024 Mar 07.
Article in English | MEDLINE | ID: mdl-38448987

ABSTRACT

BACKGROUND: COVID-19 has had a significant impact on people's mental health and mental health services. During the first year of the pandemic, existing demand was not fully met while new demand was generated, resulting in large numbers of people requiring support. To support mental health services to recover without being overwhelmed, it was important to know where services would experience increased pressure, and what strategies could be implemented to mitigate this. METHODS: We implemented a computer simulation model of patient flow through an integrated mental health service in Southwest England covering General Practice (GP), community-based 'talking therapies' (IAPT), acute hospital care, and specialist care settings. The model was calibrated on data from 1 April 2019 to 1 April 2021. Model parameters included patient demand, service-level length of stay, and probabilities of transitioning to other care settings. We used the model to compare 'do nothing' (baseline) scenarios to 'what if' (mitigation) scenarios, including increasing capacity and reducing length of stay, for two future demand trajectories from 1 April 2021 onwards. RESULTS: The results from the simulation model suggest that, without mitigation, the impact of COVID-19 will be an increase in pressure on GP and specialist community-based services by 50% and 50-100%, respectively. Simulating the impact of possible mitigation strategies, results show that increasing capacity in lower-acuity services, such as GP, causes a shift in demand to other parts of the mental health system, while decreasing length of stay in higher-acuity services is insufficient to mitigate the impact of increased demand. CONCLUSION: In capturing the interrelation of patient-flow dynamics between various mental health care settings, we demonstrate the value of computer simulation for assessing the impact of interventions on system flow.
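As a sketch of how such a patient-flow model can be built, the following minimal discrete-event simulation (using the SimPy library) sends patients through a single capacity-constrained service with an exponential length of stay and a probabilistic onward referral. All parameter values and the single-service structure are illustrative assumptions; the study's calibrated multi-setting model is considerably richer.

```python
# Minimal patient-flow sketch: arrivals queue for a capacity-limited
# service, occupy it for a length of stay, then may be referred onward.
# Parameters are invented for illustration.
import random
import simpy

ARRIVALS_PER_DAY = 10        # mean demand entering the system
GP_CAPACITY = 4              # concurrent GP appointments available
MEAN_LOS_DAYS = 0.5          # mean "length of stay" in the GP setting
P_REFER_SPECIALIST = 0.2     # probability of onward referral

referrals = 0

def patient(env, gp):
    global referrals
    with gp.request() as req:
        yield req                                    # queue for GP capacity
        yield env.timeout(random.expovariate(1 / MEAN_LOS_DAYS))
    if random.random() < P_REFER_SPECIALIST:
        referrals += 1                               # onward demand

def arrivals(env, gp):
    while True:
        yield env.timeout(random.expovariate(ARRIVALS_PER_DAY))
        env.process(patient(env, gp))

env = simpy.Environment()
gp = simpy.Resource(env, capacity=GP_CAPACITY)
env.process(arrivals(env, gp))
env.run(until=365)                                   # simulate one year
print(f"Referrals to specialist care over a year: {referrals}")
```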

3.
Alzheimers Dement ; 19(12): 5970-5987, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37768001

ABSTRACT

INTRODUCTION: Experimental models are essential tools in neurodegenerative disease research. However, the translation of insights and drugs discovered in model systems has proven immensely challenging, marred by high failure rates in human clinical trials. METHODS: Here we review the application of artificial intelligence (AI) and machine learning (ML) in experimental medicine for dementia research. RESULTS: Considering the specific challenges of reproducibility and translation between other species or model systems and human biology in preclinical dementia research, we highlight best practices and resources that can be leveraged to quantify and evaluate translatability. We then evaluate how AI and ML approaches could be applied to enhance both cross-model reproducibility and translation to human biology, while sustaining biological interpretability. DISCUSSION: AI and ML approaches in experimental medicine remain in their infancy. However, they have great potential to strengthen preclinical research and translation if based upon adequate, robust, and reproducible experimental data. HIGHLIGHTS: There are increasing applications of AI in experimental medicine. We identified issues in reproducibility, cross-species translation, and data curation in the field. Our review highlights data resources and AI approaches as solutions. Multi-omics analysis with AI offers exciting future possibilities in drug discovery.


Subject(s)
Dementia , Neurodegenerative Diseases , Humans , Artificial Intelligence , Reproducibility of Results , Machine Learning
4.
Alzheimers Dement ; 19(12): 5934-5951, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37639369

ABSTRACT

Artificial intelligence (AI) and machine learning (ML) approaches are increasingly being used in dementia research. However, several methodological challenges exist that may limit the insights we can obtain from high-dimensional data and our ability to translate these findings into improved patient outcomes. To improve reproducibility and replicability, researchers should make their well-documented code and modeling pipelines openly available. Data should also be shared where appropriate. To enhance the acceptability of models and AI-enabled systems to users, researchers should prioritize interpretable methods that provide insights into how decisions are generated. Models should be developed using multiple, diverse datasets to improve robustness, generalizability, and reduce potentially harmful bias. To improve clarity and reproducibility, researchers should adhere to reporting guidelines that are co-produced with multiple stakeholders. If these methodological challenges are overcome, AI and ML hold enormous promise for changing the landscape of dementia research and care. HIGHLIGHTS: Machine learning (ML) can improve diagnosis, prevention, and management of dementia. Inadequate reporting of ML procedures affects reproduction/replication of results. ML models built on unrepresentative datasets do not generalize to new datasets. Obligatory metrics for certain model structures and use cases have not been defined. Interpretability and trust in ML predictions are barriers to clinical translation.


Subject(s)
Artificial Intelligence , Dementia , Humans , Reproducibility of Results , Machine Learning , Research Design , Dementia/diagnosis
5.
ArXiv ; 2023 Mar 02.
Article in English | MEDLINE | ID: mdl-36911275

ABSTRACT

INTRODUCTION: Machine learning (ML) has been extremely successful in identifying key features from high-dimensional datasets and executing complicated tasks with human expert levels of accuracy or greater. METHODS: We summarize and critically evaluate current applications of ML in dementia research and highlight directions for future research. RESULTS: We present an overview of ML algorithms most frequently used in dementia research and highlight future opportunities for the use of ML in clinical practice, experimental medicine, and clinical trials. We discuss issues of reproducibility, replicability and interpretability and how these impact the clinical applicability of dementia research. Finally, we give examples of how state-of-the-art methods, such as transfer learning, multi-task learning, and reinforcement learning, may be applied to overcome these issues and aid the translation of research to clinical practice in the future. DISCUSSION: ML-based models hold great promise to advance our understanding of the underlying causes and pathological mechanisms of dementia.

6.
BMC Med Inform Decis Mak ; 23(1): 29, 2023 02 07.
Article in English | MEDLINE | ID: mdl-36750952

ABSTRACT

BACKGROUND: In the United Kingdom, Emergency Departments (EDs) are under significant pressure due to an ever-increasing number of attendances. Understanding how the capacity of other urgent care services and the health of a population may influence ED attendances is imperative for commissioners and policy makers to develop long-term strategies for reducing this pressure and improving quality and safety. METHODS: We developed a novel multi-granular stacked regression (MGSR) model using publicly available data to predict future mean monthly ED attendances within Clinical Commissioning Group (CCG) regions in England. The MGSR combines measures of population health and health service capacity in other related settings. We assessed model performance using the R-squared statistic, measuring variance explained, and the Mean Absolute Percentage Error (MAPE), measuring forecasting accuracy. We used the MGSR to forecast ED demand over a 4-year period under hypothetical scenarios where service capacity is increased or population health is improved. RESULTS: Measures of service capacity explain 41 ± 4% of the variance in monthly ED attendances and measures of population health explain 62 ± 22%. The MGSR leads to an overall improvement in performance, with an R-squared of 0.79 ± 0.02 and a MAPE of 3% when forecasting mean monthly ED attendances per CCG. Using the MGSR to forecast long-term demand under different scenarios, we found that improving population health would reduce peak ED attendances per CCG by approximately 1000 per month after 2 years. CONCLUSION: Combining models of population health and wider urgent care service capacity for predicting monthly ED attendances leads to improved performance compared with each model individually. Policies designed to improve population health will reduce ED attendances and enhance quality and safety in the long term.
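As an illustration of the stacking idea behind an MGSR-style model, the sketch below fits one regression on (synthetic) service-capacity features and one on (synthetic) population-health features, then combines their predictions with a meta-regressor and reports R-squared and MAPE. Feature names, data and model choices are assumptions for illustration only; a production stack would also use out-of-fold predictions to train the meta-model.

```python
# Minimal stacked-regression sketch: two domain-specific sub-models
# feed a meta-model that predicts monthly attendances. Synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_absolute_percentage_error

rng = np.random.default_rng(1)
n = 400
capacity = rng.normal(size=(n, 3))        # e.g. GP, 111, ambulance capacity
health = rng.normal(size=(n, 3))          # e.g. condition prevalence measures
attendances = (capacity @ [2, -1, 0.5] + health @ [3, 1, -2]
               + rng.normal(scale=0.5, size=n) + 100)

Xc_tr, Xc_te, Xh_tr, Xh_te, y_tr, y_te = train_test_split(
    capacity, health, attendances, test_size=0.25, random_state=0)

# Level-0 models, one per feature domain
m_capacity = LinearRegression().fit(Xc_tr, y_tr)
m_health = LinearRegression().fit(Xh_tr, y_tr)

# Level-1 (stacked) model on the sub-model predictions
stack_train = np.column_stack([m_capacity.predict(Xc_tr), m_health.predict(Xh_tr)])
stack_test = np.column_stack([m_capacity.predict(Xc_te), m_health.predict(Xh_te)])
meta = LinearRegression().fit(stack_train, y_tr)

pred = meta.predict(stack_test)
print(f"R^2 = {r2_score(y_te, pred):.2f}, "
      f"MAPE = {mean_absolute_percentage_error(y_te, pred):.1%}")
```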


Subject(s)
Emergency Service, Hospital , Humans , England , United Kingdom , Forecasting
7.
Int J Mol Sci ; 24(2)2023 Jan 13.
Article in English | MEDLINE | ID: mdl-36675153

ABSTRACT

Folate deficiencies, folate imbalance and associated abnormal methylation are associated with birth defects, developmental delays, neurological conditions and diseases. In the hydrocephalic Texas (H-Tx) rat, 10-formyl tetrahydrofolate dehydrogenase (FDH) is reduced or absent from the CSF and the nuclei of cells in the brain and liver, and this is correlated with decreased DNA methylation. In the present study, we tested whether impaired folate metabolism or methylation exists in sexually mature, unaffected H-Tx rats, which may explain the propagation of hydrocephalus in their offspring. We compared normal Sprague Dawley (SD, n = 6) rats with untreated H-Tx (uH-Tx, n = 6) and folate-treated H-Tx (TrH-Tx, n = 4) rats. Structural abnormalities were observed in the testis of uH-Tx rats, with decreased methylation, increased demethylation, and cell death, particularly of sperm. FDH and FRα protein expression was increased in uH-Tx males but not in folate-treated males, although tissue folate levels were unchanged. 5-Methylcytosine was significantly reduced in untreated and partially restored in treated individuals, while 5-hydroxymethylcytosine was not significantly changed. Similarly, a decrease in DNA-methyltransferase-1 expression in uH-Tx rats was partially reversed with treatment. The data expose a significant germline methylation error in unaffected adult male H-Tx rats from which hydrocephalic offspring are obtained. Reduced methylation in the testis and sperm was partially recovered by treatment with folate supplements, leading us to conclude that this neurological disorder may not be completely eradicated by maternal supplementation alone.


Subject(s)
Folic Acid , Hydrocephalus , Animals , Male , Rats , DNA Methylation , Folic Acid/metabolism , Folic Acid/pharmacology , Folic Acid/therapeutic use , Rats, Sprague-Dawley , Semen/metabolism , Hydrocephalus/congenital , Hydrocephalus/drug therapy , Hydrocephalus/genetics , Hydrocephalus/metabolism , Disease Models, Animal , Folate Receptor 1/genetics , Folate Receptor 1/metabolism
8.
Stroke ; 53(9): 2758-2767, 2022 09.
Article in English | MEDLINE | ID: mdl-35862194

ABSTRACT

BACKGROUND: Expert opinion is that about 20% of emergency stroke patients should receive thrombolysis. Currently, 11% to 12% of patients in England and Wales receive thrombolysis, ranging from 2% to 24% between hospitals. The aim of this study was to assess how much variation is due to differences in local patient populations, and how much is due to differences in clinical decision-making and stroke pathway performance, while estimating a realistic target thrombolysis use. METHODS: Anonymised data for 246 676 emergency stroke admissions to 132 acute hospitals in England and Wales between 2016 and 2018 were obtained from the Sentinel Stroke National Audit Programme. We used machine learning to learn, for each hospital, which patients would be given thrombolysis. We used clinical pathway simulation to model effects of changing pathway performance. Qualitative research was used to assess clinician attitudes to these methods. Three changes were modeled: (1) arrival-to-treatment in 30 minutes, (2) proportion of patients with determined stroke onset times set to at least the national upper quartile, (3) thrombolysis decisions based on a majority vote of a benchmark set of hospitals. RESULTS: Of the modeled changes, any single change was predicted to increase national thrombolysis use from 11.6% to between 12.3% and 14.5% (clinical decision-making having the most effect). Combined, these changes would be expected to increase thrombolysis to 18.3%, but there would still be significant variation between hospitals depending on local patient population. Clinicians engaged well with the modeling, but those from hospitals with lower thrombolysis use were most cautious about the methods. CONCLUSIONS: Machine learning and clinical pathway simulation may be applied at scale to national stroke audit data, allowing extended use and analysis of audit data. Stroke thrombolysis rates of at least 18% look achievable in England and Wales, but each hospital should have its own target.
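To make the "benchmark majority vote" idea concrete, the sketch below fits one logistic-regression decision model per hospital on synthetic data and then lets a benchmark subset of those models vote on whether a new patient would receive thrombolysis. Everything here (features, decision rules, the choice of logistic regression) is an illustrative assumption rather than the study's actual pipeline.

```python
# Sketch: per-hospital decision models plus a benchmark majority vote.
# Data and hospital behaviour are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n_hospitals, n_patients, n_features = 10, 300, 5

hospital_models = []
for h in range(n_hospitals):
    X = rng.normal(size=(n_patients, n_features))   # patient characteristics
    # Hospitals vary in their decision threshold (synthetic behaviour)
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n_patients)
         > 0.2 * h).astype(int)
    hospital_models.append(LogisticRegression().fit(X, y))

# A "benchmark" subset of hospitals (e.g. the highest-thrombolysing ones)
benchmark = hospital_models[:5]

new_patient = rng.normal(size=(1, n_features))
votes = [m.predict(new_patient)[0] for m in benchmark]
decision = int(sum(votes) > len(votes) / 2)          # majority vote
print(f"Benchmark votes: {votes} -> thrombolyse = {bool(decision)}")
```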


Subject(s)
Critical Pathways , Stroke , Administration, Intravenous , Fibrinolytic Agents/therapeutic use , Humans , Machine Learning , Stroke/drug therapy , Thrombolytic Therapy/methods
9.
AMRC Open Res ; 3: 19, 2022 Apr 08.
Article in English | MEDLINE | ID: mdl-35726231

ABSTRACT

Introduction: Dysphagia often occurs during Parkinson's disease (PD) and can have severe consequences. Recently, neuromodulatory techniques have been used to treat neurogenic dysphagia. Here we aimed to compare the neurophysiological and swallowing effects of three different types of neurostimulation: 5 Hertz (Hz) repetitive transcranial magnetic stimulation (rTMS), 1 Hz rTMS, and pharyngeal electrical stimulation (PES), in patients with PD. Method: Twelve PD patients with dysphagia were randomised to receive either 5 Hz rTMS, 1 Hz rTMS, or PES. In a cross-over design, patients were assigned to one intervention and received both real and sham stimulation. Patients received a baseline videofluoroscopic (VFS) assessment of their swallowing, enabling penetration aspiration scores (PAS) to be calculated for thin fluids, paste, solids and cup drinking. Swallowing timing measurements were also performed on thin fluid swallows only. They then had baseline recordings of motor evoked potentials (MEPs) from both pharyngeal and (as a control) abductor pollicis brevis (APB) cortical areas using single-pulse TMS. Subsequently, the intervention was administered and post-interventional TMS recordings were taken at 0 and 30 minutes, followed by a repeat VFS within 60 minutes of intervention. Results: All interventions were well tolerated. Due to lower than expected recruitment, statistical analysis of the data was not undertaken. However, with respect to PAS, swallowing timings and MEP amplitudes, there was a small but visible difference in outcomes between active and sham stimulation. Conclusion: PES, 5 Hz rTMS and 1 Hz rTMS are tolerable interventions in PD-related dysphagia. Due to small patient numbers, no definitive conclusions could be drawn from the data with respect to individual interventions improving swallowing function or comparative effectiveness between interventions. Larger future studies are needed to further explore the efficacy of these neuromodulatory treatments in Parkinson's disease-associated dysphagia.

10.
Front Immunol ; 13: 834757, 2022.
Article in English | MEDLINE | ID: mdl-35432299

ABSTRACT

Mycobacterium bovis bacille Calmette-Guérin (BCG) has been used for 100 years and prevents disseminated tuberculosis and death in young children. However, it shows only partial efficacy against pulmonary tuberculosis (TB) in adults, so new vaccines are urgently needed. The protective efficacy of BCG depends on T cells, which are typically activated by pathogen-derived protein antigens that bind to highly polymorphic major histocompatibility complex (MHC) molecules. Some T cells recognize non-protein antigens via antigen presenting systems that are independent of genetic background, leading to their designation as donor-unrestricted T (DURT) cells. Whether live whole cell vaccines, like BCG, can induce durable expansions of DURT cells in humans is not known. We used combinatorial tetramer staining, multi-parameter flow cytometry, and immunosequencing to comprehensively characterize the effect of BCG on activation and expansion of DURT cell subsets. We examined peripheral blood mononuclear cells (PBMC) derived from a Phase I study of South African adults in which samples were archived at baseline, 3 weeks, and 52 weeks post-BCG revaccination. We did not observe a change in the frequency of total mucosal-associated invariant T (MAIT) cells, invariant natural killer T (iNKT) cells, germline encoded mycolyl-reactive (GEM) T cells, or γδ T cells at 52 weeks post-BCG. However, immunosequencing revealed a set of TCR-δ clonotypes that were expanded at 52 weeks post-BCG revaccination. These expanded clones expressed the Vδ2 gene segment and could be further defined on the basis of biochemical similarity into several 'meta-clonotypes' that likely recognize similar epitopes. Our data reveal that BCG vaccination leads to durable expansion of DURT cell clonotypes despite a limited effect on total circulating frequencies in the blood and have implications for defining the immunogenicity of candidate whole cell TB vaccines.


Subject(s)
BCG Vaccine , Mycobacterium tuberculosis , Adult , Child , Child, Preschool , Humans , Immunization, Secondary , Leukocytes, Mononuclear , Receptors, Antigen, T-Cell
11.
Nat Commun ; 13(1): 78, 2022 01 10.
Article in English | MEDLINE | ID: mdl-35013257

ABSTRACT

T cells recognize mycobacterial glycolipid (mycolipid) antigens presented by CD1b molecules, but the role of CD4 and CD8 co-receptors in mycolipid recognition is unknown. Here we show CD1b-mycolipid tetramers reveal a hierarchy in which circulating T cells expressing CD4 or CD8 co-receptor stain with a higher tetramer mean fluorescence intensity than CD4-CD8- T cells. CD4+ primary T cells transduced with mycolipid-specific T cell receptors bind CD1b-mycolipid tetramer with a higher fluorescence intensity than CD8+ primary T cells. The presence of either CD4 or CD8 also decreases the threshold for interferon-γ secretion. Co-receptor expression increases surface expression of CD3ε, suggesting a mechanism for increased tetramer binding and activation. Targeted transcriptional profiling of mycolipid-specific T cells from individuals with active tuberculosis reveals canonical markers associated with cytotoxicity among CD8+ compared to CD4+ T cells. Thus, expression of co-receptors modulates T cell receptor avidity for mycobacterial lipids, leading to in vivo functional diversity during tuberculosis disease.


Subject(s)
Antigens, CD1/immunology , Glycolipids/immunology , Mycobacterium tuberculosis/immunology , Tuberculosis/immunology , Antigens, CD1/genetics , CD3 Complex/genetics , CD3 Complex/immunology , CD4-Positive T-Lymphocytes/immunology , CD4-Positive T-Lymphocytes/microbiology , CD8-Positive T-Lymphocytes/immunology , CD8-Positive T-Lymphocytes/microbiology , Cytotoxicity, Immunologic , Gene Expression , Glycolipids/metabolism , Humans , Interferon-gamma/genetics , Interferon-gamma/immunology , Lymphocyte Activation , Mycobacterium tuberculosis/growth & development , Primary Cell Culture , Protein Binding , Protein Multimerization , Transduction, Genetic , Tuberculosis/genetics , Tuberculosis/microbiology
12.
Am J Health Promot ; 36(4): 597-601, 2022 05.
Article in English | MEDLINE | ID: mdl-34939446

ABSTRACT

The Outreach Core of the U54 Partnership between the Dana-Farber/Harvard Cancer Center and the University of Massachusetts Boston created a new model for addressing cancer inequities that integrates implementation science, community-engaged research, and health promotion. Key elements of the approach include engaging a Community Advisory Board, supporting students from underrepresented minority backgrounds to conduct health promotion and community-engaged research, increasing the delivery of evidence-based cancer prevention programs to underserved communities (directly and by training local organizations), supporting research-practice partnerships, and disseminating findings. Our model highlights the need for long-term investments to connect underserved communities with evidence-based cancer prevention.


Subject(s)
Carcinoma, Hepatocellular , Liver Neoplasms , Health Promotion , Humans , Implementation Science , Massachusetts
13.
Ophthalmic Epidemiol ; 29(4): 353-362, 2022 08.
Article in English | MEDLINE | ID: mdl-34622738

ABSTRACT

PURPOSE: We aimed to review available data on the incidence of herpes simplex virus (HSV) keratitis and other HSV ocular disease and to estimate the global burden of HSV ocular disease. METHODS: We searched Medline and Embase databases to October 2020 for studies reporting on the incidence of HSV ocular disease. Study quality was evaluated using a four-point checklist. Pooled estimates were applied to 2016 population data to estimate global HSV ocular disease burden. Numbers with uniocular vision impairment (any visual acuity <6/12) were estimated by applying published risks to case numbers. RESULTS: Fourteen studies had incidence data; seven met our quality criteria. In 2016, an estimated 1.7 (95% confidence interval, 95% CI 1.0-3.0) million people had HSV keratitis, based on a pooled incidence of 24.0 (95% CI 14.0-41.0; N = 2; I2 = 97.7%) per 100,000 person-years. The majority had epithelial keratitis (pooled incidence 16.1 per 100,000; 95% CI 11.6-22.3; N = 3; I2 = 92.6%). Available studies were few and limited to the USA and Europe. Data were even more limited for HSV uveitis and retinitis, although these conditions may collectively contribute a further >0.1 million cases. Based on global incidence, some 230,000 people may have newly acquired uniocular vision impairment associated with HSV keratitis in 2016. CONCLUSION: Over 1.8 million people may have herpetic eye disease annually. Preventing HSV infection could therefore have an important impact on eye health. Herpetic eye disease burden is likely to have been underestimated, as many settings outside of the USA and Europe have higher HSV-1 prevalence and poorer access to treatment.
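The burden arithmetic reported above can be reproduced approximately as follows. The world-population figure is rounded, and the per-case risk of uniocular vision impairment is an assumed value chosen only to show how published risks would be applied to case numbers; the actual risk estimates come from the cited literature, not from this sketch.

```python
# Approximate reproduction of the burden calculation described above.
# The population figure is rounded; the impairment risk is an assumption
# for illustration, not a value given in the abstract.
world_population = 7.4e9            # approximate 2016 world population
incidence_per_100k = 24.0           # pooled HSV keratitis incidence (per 100,000 person-years)
assumed_impairment_risk = 0.13      # illustrative per-case risk of uniocular vision impairment

annual_cases = world_population * incidence_per_100k / 100_000
newly_impaired = annual_cases * assumed_impairment_risk

print(f"~{annual_cases / 1e6:.1f} million HSV keratitis cases per year")
print(f"~{newly_impaired / 1e3:.0f} thousand with new uniocular vision impairment")
```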


Subject(s)
Keratitis, Herpetic , Eye , Humans , Incidence , Keratitis, Herpetic/epidemiology , Prevalence , Simplexvirus
14.
JAMA Netw Open ; 4(12): e2136553, 2021 12 01.
Article in English | MEDLINE | ID: mdl-34913981

ABSTRACT

Importance: Machine learning algorithms could be used as the basis for clinical decision-making aids to enhance clinical practice. Objective: To assess the ability of machine learning algorithms to predict dementia incidence within 2 years compared with existing models and determine the optimal analytic approach and number of variables required. Design, Setting, and Participants: This prognostic study used data from a prospective cohort of 15 307 participants without dementia at baseline to perform a secondary analysis of factors that could be used to predict dementia incidence. Participants attended National Alzheimer Coordinating Center memory clinics across the United States between 2005 and 2015. Analyses were conducted from March to May 2021. Exposures: 258 variables spanning domains of dementia-related clinical measures and risk factors. Main Outcomes and Measures: The main outcome was incident all-cause dementia diagnosed within 2 years of baseline assessment. Results: In a sample of 15 307 participants (mean [SD] age, 72.3 [9.8] years; 9129 [60%] women and 6178 [40%] men) without dementia at baseline, 1568 (10%) received a diagnosis of dementia within 2 years of their initial assessment. Compared with 2 existing models for dementia risk prediction (ie, Cardiovascular Risk Factors, Aging, and Incidence of Dementia Risk Score, and the Brief Dementia Screening Indicator), machine learning algorithms were superior in predicting incident all-cause dementia within 2 years. The gradient-boosted trees algorithm had a mean (SD) overall accuracy of 92% (1%), sensitivity of 0.45 (0.05), specificity of 0.97 (0.01), and area under the curve of 0.92 (0.01) using all 258 variables. Analysis of variable importance showed that only 6 variables were required for machine learning algorithms to achieve an accuracy of 91% and area under the curve of at least 0.89. Machine learning algorithms also identified up to 84% of participants who received an initial dementia diagnosis that was subsequently reversed to mild cognitive impairment or cognitively unimpaired, suggesting possible misdiagnosis. Conclusions and Relevance: These findings suggest that machine learning algorithms could accurately predict incident dementia within 2 years in patients receiving care at memory clinics using only 6 variables. These findings could be used to inform the development and validation of decision-making aids in memory clinics.
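As a sketch of the modelling approach reported (a gradient-boosted tree classifier evaluated by accuracy, sensitivity, specificity and AUC, with variable-importance ranking), the following uses synthetic data with roughly the same 10% outcome prevalence. Nothing here reproduces the study's variables or results.

```python
# Sketch: gradient-boosted trees predicting a rare binary outcome,
# evaluated with the metrics reported in the abstract. Synthetic data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, confusion_matrix, accuracy_score

# Synthetic stand-in for the cohort: ~10% incident dementia within 2 years
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.9],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)

model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
prob = model.predict_proba(X_te)[:, 1]
pred = (prob >= 0.5).astype(int)

tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
print(f"accuracy = {accuracy_score(y_te, pred):.2f}")
print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
print(f"AUC = {roc_auc_score(y_te, prob):.2f}")

# Variable importance (the abstract reports only 6 variables were needed)
top6 = np.argsort(model.feature_importances_)[::-1][:6]
print("Top 6 features by importance:", top6)
```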


Subject(s)
Cognitive Dysfunction/diagnosis , Dementia/diagnosis , Machine Learning/statistics & numerical data , Risk Assessment/methods , Aged , Area Under Curve , Dementia/epidemiology , Disease Progression , Female , Humans , Incidence , Male , Predictive Value of Tests , Prognosis , Prospective Studies , Risk Factors , Sensitivity and Specificity , United States
15.
EClinicalMedicine ; 35: 100876, 2021 May.
Article in English | MEDLINE | ID: mdl-34027335

ABSTRACT

BACKGROUND: Herpes simplex virus type 2 (HSV-2) infection is a prevalent, sexually transmitted infection with a sizable disease burden that is highest in sub-Saharan Africa. This study aimed to characterize HSV-2 epidemiology in this region. METHODS: Cochrane and PRISMA guidelines were followed to systematically review, synthesize, and report HSV-2 related findings up to August 23, 2020. Meta-analyses and meta-regressions were conducted. FINDINGS: From 218 relevant publications, 451 overall outcome measures and 869 stratified measures were extracted. Pooled incidence rates ranged between 2.4-19.4 per 100 person-years across populations. Pooled seroprevalence was lowest at 37.3% (95% confidence interval (CI): 34.9-39.7%) in general populations and high in female sex workers and HIV-positive individuals at 62.5% (95% CI: 54.8-70.0%) and 71.3% (95% CI: 66.5-75.9%), respectively. In general populations, pooled seroprevalence increased steadily with age. Compared to women, men had a lower seroprevalence with an adjusted risk ratio (ARR) of 0.61 (95% CI: 0.56-0.67). Seroprevalence has decreased in recent decades with an ARR of 0.98 (95% CI: 0.97-0.99) per year. Seroprevalence was highest in Eastern and Southern Africa. Pooled HSV-2 proportion in genital ulcer disease was 50.7% (95% CI: 44.7-56.8%) and in genital herpes it was 97.3% (95% CI: 84.4-100%). INTERPRETATION: Seroprevalence is declining by 2% per year, but a third of the population is infected. Age and geography play profound roles in HSV-2 epidemiology. Temporal declines and geographic distribution of HSV-2 seroprevalence mirror that of HIV prevalence, suggesting sexual risk behavior has been declining for three decades. HSV-2 is the etiological cause of half of genital ulcer disease and nearly all genital herpes cases with limited role for HSV-1. FUNDING: This work was supported by pilot funding from the Biomedical Research Program at Weill Cornell Medicine in Qatar and by the Qatar National Research Fund [NPRP 9-040-3-008].
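For readers unfamiliar with how such pooled estimates are produced, the sketch below implements a basic DerSimonian-Laird random-effects pooling of study-level proportions with an I² heterogeneity statistic. The study proportions and sample sizes are invented; a real seroprevalence analysis would usually transform proportions (e.g. logit) before pooling and would add the meta-regression on age, sex and region described above.

```python
# Sketch: DerSimonian-Laird random-effects pooling of proportions.
# Study values are invented for illustration.
import numpy as np

p = np.array([0.35, 0.42, 0.31, 0.58, 0.40])   # study-level seroprevalence
n = np.array([900, 450, 1200, 300, 700])       # study sample sizes

var = p * (1 - p) / n                          # within-study variance
w = 1 / var                                    # fixed-effect weights

# DerSimonian-Laird estimate of between-study variance tau^2
fixed = np.sum(w * p) / np.sum(w)
q = np.sum(w * (p - fixed) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - (len(p) - 1)) / c)

w_re = 1 / (var + tau2)                        # random-effects weights
pooled = np.sum(w_re * p) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
i2 = max(0.0, (q - (len(p) - 1)) / q) * 100    # heterogeneity statistic

print(f"pooled = {pooled:.1%} (95% CI {pooled - 1.96*se:.1%} to "
      f"{pooled + 1.96*se:.1%}), I^2 = {i2:.0f}%")
```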

16.
Transl Behav Med ; 11(2): 452-461, 2021 03 16.
Article in English | MEDLINE | ID: mdl-32515481

ABSTRACT

Increasing the use of evidence-based programs (EBPs) in community settings is critical for improving health and reducing disparities. Community-based organizations (CBOs) and faith-based organizations (FBOs) have tremendous reach and trust within underserved communities, but their impact is constrained by limited staff capacity to use EBPs. This exploratory study sought to identify design and delivery considerations that could increase the impact of capacity-building interventions for CBOs and FBOs working with underserved communities. Data come from a community-based participatory research project addressing cancer disparities in Black, Latino, and Brazilian communities from Greater Boston and Greater Lawrence, Massachusetts. We conducted four focus group discussions with program coordinators in CBOs and FBOs (n = 27) and key informant interviews with CBO and FBO leaders (n = 15). Three researchers analyzed the data using a multi-stage coding process that included both prefigured and emergent codes. Key design considerations included embedding customized capacity-building interventions into community networks with local experts, supporting ongoing engagement with the intervention via a range of resources and communication channels, and addressing resource constraints. Regarding the contextual factors that should influence capacity-building intervention content, participants highlighted resource constraints, environments in which EBP use is not the norm, and challenges linking available programs with the multi-level barriers to good health faced by community members. Overall, the study highlights the need for integrated, long-term capacity-building efforts developed in partnership with, and ultimately sustained by, local organizations.


Subject(s)
Faith-Based Organizations , Health Promotion , Capacity Building , Community Networks , Community-Based Participatory Research , Humans
17.
Poult Sci ; 99(11): 5517-5525, 2020 Nov.
Article in English | MEDLINE | ID: mdl-33142470

ABSTRACT

Qualities of the light environment, such as the spectral composition of light, have been shown to impact growth and performance of broiler chickens. UVA light is visible to broiler chickens, whereas UVB wavelengths promote endogenous vitamin D synthesis, which could support their rapid development. The aim of the current study was to investigate the impacts of supplementary UVA and UVB wavelengths on performance indicators of broiler chickens. Day-old Ross 308 chicks (n = 638), reared to a target stocking density of 33 kg/m2, were randomly assigned to 1 of 3 lighting treatments: A) White light emitting diode (LED) and supplementary UVA LED lighting (18-h photoperiod); B) White LED with supplementary UVA and UVB fluorescent lighting providing 30 µW/cm2 UVB at bird level (lights on for 8 h of the total photoperiod to avoid overexposure of UVB); and C) White LED control group, representative of farm conditions (18-h photoperiod). Mortality was recorded, and broiler chickens were individually weighed at 8, 15, 22, 27, and 34 D of age. Generalized linear models and nonlinear mixed effects models (Gompertz curve) were fitted to determine the effects of UV wavelengths on broiler mortality and growth performance. UV did not impact breast or leg weight of broiler chickens but was associated with differences in mortality, growth, and end weight. Broiler chickens provided with UVA for the full 18-h photoperiod had slower initial growth than control broilers and a reduction in mortality. Results from male broilers reared with supplementary UVA + UVB for 8 h indicated they could reach finishing weights sooner than controls, which supports the potential for UVA + B to improve the growth performance of males. Results suggest that the provision of supplementary UVA + UVB wavelengths may improve the performance of male broiler chickens. The reduction in mortality in the UVA only treatment may warrant further investigation. The inclusion of UV wavelengths within poultry lighting regimes represents a promising area of further study.
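As a minimal illustration of Gompertz growth-curve fitting of the kind used for the body-weight data, the sketch below fits the three-parameter Gompertz function to synthetic weights at the weighing ages listed in the abstract. The weight values, starting parameters and the omission of the mixed-effects (bird- and treatment-level) structure are all simplifying assumptions.

```python
# Sketch: fitting a Gompertz growth curve to body-weight data.
# Weights and starting values are invented; mixed effects are omitted.
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, a, b, c):
    """a = asymptotic weight, b = displacement, c = growth-rate constant."""
    return a * np.exp(-b * np.exp(-c * t))

# Synthetic weights (g) at the weighing ages reported in the abstract
age_days = np.array([8, 15, 22, 27, 34])
weight_g = (np.array([200.0, 550.0, 1050.0, 1500.0, 2100.0])
            + np.random.default_rng(3).normal(0, 30, 5))

params, _ = curve_fit(gompertz, age_days, weight_g,
                      p0=[3000, 4, 0.05], maxfev=5000)
a, b, c = params
print(f"asymptotic weight ≈ {a:.0f} g, growth-rate constant ≈ {c:.3f}/day")
```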


Subject(s)
Chickens , Growth , Ultraviolet Rays , Animals , Body Weight/radiation effects , Chickens/growth & development , Growth/radiation effects , Male , Photoperiod
18.
Bull World Health Organ ; 98(5): 315-329, 2020 May 01.
Article in English | MEDLINE | ID: mdl-32514197

ABSTRACT

OBJECTIVE: To generate global and regional estimates for the prevalence and incidence of herpes simplex virus (HSV) type 1 and type 2 infection for 2016. METHODS: To obtain data, we undertook a systematic review to identify studies up to August 2018. Adjustments were made to account for HSV test sensitivity and specificity. For each World Health Organization (WHO) region, we applied a constant incidence model to pooled prevalence by age and sex to estimate the prevalence and incidence of HSV types 1 and 2 infections. For HSV type 1, we apportioned infection by anatomical site using pooled estimates of the proportions that were oral and genital. FINDINGS: In 2016, an estimated 491.5 million people (95% uncertainty interval, UI: 430.4 million-610.6 million) were living with HSV type 2 infection, equivalent to 13.2% of the world's population aged 15-49 years. An estimated 3752.0 million people (95% UI: 3555.5 million-3854.6 million) had HSV type 1 infection at any site, equivalent to a global prevalence of 66.6% in 0-49-year-olds. Differing patterns were observed by age, sex and geographical region, with HSV type 2 prevalence being highest among women and in the WHO African Region. CONCLUSION: An estimated half a billion people had genital infection with HSV type 2 or type 1, and several billion had oral HSV type 1 infection. Millions of people may also be at higher risk of acquiring human immunodeficiency virus (HIV), particularly women in the WHO African Region who have the highest HSV type 2 prevalence and exposure to HIV.


Subject(s)
Herpes Simplex/epidemiology , Herpes Simplex/virology , Adolescent , Adult , Age Distribution , Child , Child, Preschool , Female , Global Health , Herpes Genitalis , Herpesvirus 1, Human/isolation & purification , Herpesvirus 2, Human/isolation & purification , Humans , Incidence , Infant , Male , Middle Aged , Prevalence , Young Adult
19.
BMJ Glob Health ; 5(3): e001875, 2020.
Article in English | MEDLINE | ID: mdl-32201620

ABSTRACT

Introduction: Herpes simplex virus (HSV) infection can cause painful, recurrent genital ulcer disease (GUD), which can have a substantial impact on sexual and reproductive health. HSV-related GUD is most often due to HSV type 2 (HSV-2), but may also be due to genital HSV type 1 (HSV-1), which has less frequent recurrent episodes than HSV-2. The global burden of GUD has never been quantified. Here we present the first global and regional estimates of GUD due to HSV-1 and HSV-2 among women and men aged 15-49 years old. Methods: We developed a natural history model reflecting the clinical course of GUD following HSV-2 and genital HSV-1 infection, informed by a literature search for data on model parameters. We considered both diagnosed and undiagnosed symptomatic infection. This model was then applied to existing infection estimates and population sizes for 2016. A sensitivity analysis was carried out varying the assumptions made. Results: We estimated that 187 million people aged 15-49 years had at least one episode of HSV-related GUD globally in 2016: 5.0% of the world's population. Of these, 178 million (95% of those with HSV-related GUD) had HSV-2 compared with 9 million (5%) with HSV-1. GUD burden was highest in Africa, and approximately double in women compared with men. Altogether there were an estimated 8 billion person-days spent with HSV-related GUD globally in 2016, with 99% of days due to HSV-2. Taking into account parameter uncertainty, the percentage with at least one episode of HSV-related GUD ranged from 3.2% to 7.9% (120-296 million). However, the estimates were sensitive to the model assumptions. Conclusion: Our study represents a first attempt to quantify the global burden of HSV-related GUD, which is large. New interventions such as HSV vaccines, antivirals or microbicides have the potential to improve the quality of life of millions of people worldwide.


Subject(s)
Global Health , Herpes Genitalis , Ulcer , Adolescent , Adult , Female , Global Health/statistics & numerical data , Herpes Genitalis/complications , Herpes Genitalis/epidemiology , Herpesvirus 1, Human/isolation & purification , Herpesvirus 2, Human/isolation & purification , Humans , Male , Middle Aged , Models, Biological , Ulcer/epidemiology , Ulcer/virology , Young Adult
20.
Front Immunol ; 11: 170, 2020.
Article in English | MEDLINE | ID: mdl-32117300

ABSTRACT

Diseases due to mycobacteria, including tuberculosis, leprosy, and Buruli ulcer, rank among the top causes of death and disability worldwide. Animal studies have revealed the importance of T cells in controlling these infections. However, the specific antigens recognized by T cells that confer protective immunity and their associated functions remain to be definitively established. T cells that respond to mycobacterial peptide antigens exhibit classical features of adaptive immunity and have been well-studied in humans and animal models. Recently, innate-like T cells that recognize lipid and metabolite antigens have also been implicated. Specifically, T cells that recognize mycobacterial glycolipid antigens (mycolipids) have been shown to confer protection to tuberculosis in animal models and share some biological characteristics with adaptive and innate-like T cells. Here, we review the existing data suggesting that mycolipid-specific T cells exist on a spectrum of "innateness," which will influence how they can be leveraged to develop new diagnostics and vaccines for mycobacterial diseases.


Subject(s)
Antigens, Bacterial/immunology , Glycolipids/immunology , Immunity, Innate , Mycobacterium tuberculosis/immunology , T-Lymphocytes/immunology , Tuberculosis/immunology , Animals , Humans , Immunologic Memory , Leprosy/immunology , Leprosy/microbiology , Lymphocyte Activation/immunology , Phenotype , Receptors, Antigen, T-Cell/immunology , Tuberculosis/microbiology