Results 1 - 20 of 25
1.
Sci Rep ; 13(1): 21457, 2023 12 05.
Article in English | MEDLINE | ID: mdl-38052922

ABSTRACT

Social distancing interrupted transmission patterns of contact-driven infectious agents such as norovirus during the COVID-19 pandemic. Since routine surveillance of norovirus was additionally disrupted during the pandemic, traditional naïve forecasts that rely only on past public health surveillance data may not reliably represent norovirus activity. This study investigates the use of statistical modelling to predict the number of norovirus laboratory reports in England 4 weeks ahead, before and during the COVID-19 pandemic, thus providing insights to inform existing practices in norovirus surveillance in England. We compare the predictive performance of three forecasting approaches that make different assumptions about the underlying structure of the norovirus data and utilize various external data sources, including mobility, air temperature and relative internet searches (time series, regularized generalized linear model, and quantile regression forest). The performance of each approach was evaluated using multiple metrics, including a relative prediction error against the traditional naïve forecast of a five-season mean. Our data suggest that all three forecasting approaches improve predictive performance over the naïve forecasts, especially in the 2020/21 season (30-45% relative improvement), when the number of norovirus reports fell; the improvement ranged from 7% to 22% before the pandemic. However, performance varied: regularized regression incorporating internet searches showed the best forecasting score pre-pandemic, and the time series approach achieved the best results after the pandemic onset without external data. Overall, our results demonstrate that there is significant value for public health in adopting more sophisticated forecasting tools, moving beyond traditional naïve methods, and utilizing available software to enhance the precision and timeliness of norovirus surveillance in England.
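The naïve baseline referenced here, a five-season mean, is straightforward to reproduce. Below is a minimal Python sketch on synthetic data; the Poisson counts, the 0.6 scaling standing in for a model forecast, and all variable names are illustrative assumptions, not the study's data or method.

```python
import numpy as np

# Hypothetical weekly norovirus laboratory reports, shaped (seasons, weeks).
rng = np.random.default_rng(0)
history = rng.poisson(lam=200, size=(5, 52))   # five past seasons
current = rng.poisson(lam=120, size=52)        # season being forecast

# Naive forecast: the five-season mean for each week of the season.
naive_forecast = history.mean(axis=0)

# A candidate model's 4-week-ahead predictions would replace this stub.
model_forecast = 0.6 * naive_forecast

def mae(pred, obs):
    return np.abs(pred - obs).mean()

# Relative improvement of the model over the naive baseline.
improvement = 1 - mae(model_forecast, current) / mae(naive_forecast, current)
print(f"relative improvement over naive forecast: {improvement:.0%}")
```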


Subject(s)
COVID-19 , Norovirus , Humans , COVID-19/epidemiology , Public Health Surveillance , Pandemics , Seasons , Public Health , Forecasting
2.
J Med Internet Res ; 25: e37540, 2023 05 08.
Article in English | MEDLINE | ID: mdl-37155231

ABSTRACT

BACKGROUND: Norovirus is associated with approximately 18% of the global burden of gastroenteritis and affects all age groups. There is currently no licensed vaccine or available antiviral treatment. However, well-designed early warning systems and forecasting can guide nonpharmaceutical approaches to norovirus infection prevention and control. OBJECTIVE: This study evaluates the predictive power of existing syndromic surveillance data and emerging data sources, such as internet searches and Wikipedia page views, to predict norovirus activity across a range of age groups and regions in England. METHODS: We used existing and emerging syndromic data to predict laboratory data indicating norovirus activity. Two methods were used to evaluate the predictive potential of syndromic variables. First, the Granger causality framework was used to assess whether individual variables precede changes in norovirus laboratory reports in a given region or age group. Then, we used random forest modeling to estimate the importance of each variable in the context of the others, with two measures: (1) change in the mean square error and (2) node purity. Finally, these results were combined into a visualization indicating the most influential predictors of norovirus laboratory reports in a specific age group and region. RESULTS: Our results suggest that syndromic surveillance data include valuable predictors of norovirus laboratory reports in England. However, Wikipedia page views are less likely to provide prediction improvements on top of Google Trends and existing syndromic data. Predictors displayed varying relevance across age groups and regions. For example, random forest modeling based on selected existing and emerging syndromic variables explained 60% of the variance in the ≥65 years age group, 42% in the East of England, but only 13% in the South West region. Emerging data sets highlighted relative search volumes, including "flu symptoms," "norovirus in pregnancy," and norovirus activity in specific years, such as "norovirus 2016." Symptoms of vomiting and gastroenteritis in multiple age groups were identified as important predictors within existing data sources. CONCLUSIONS: Existing and emerging data sources can help predict norovirus activity in England in some age groups and geographic regions, particularly predictors concerning vomiting, gastroenteritis, and norovirus in vulnerable populations, and historical terms such as "stomach flu." However, syndromic predictors were less relevant in some age groups and regions, likely due to contrasting public health practices between regions and health information-seeking behavior between age groups. Additionally, predictors relevant to one norovirus season may not contribute to other seasons. Data biases, such as low spatial granularity in Google Trends and especially in Wikipedia data, also play a role in the results. Moreover, internet searches can provide insight into mental models, that is, an individual's conceptual understanding of norovirus infection and transmission, which could be used in public health communication strategies.
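The two-step method (Granger causality, then random forest importance) can be sketched roughly in Python. In the sketch below, permutation importance approximates the change-in-MSE measure and scikit-learn's impurity-based importances play the role of node purity; all series, lags and predictor names are synthetic, illustrative assumptions rather than the study's pipeline.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

# Hypothetical weekly series standing in for search volumes and lab reports.
rng = np.random.default_rng(1)
n = 200
searches = rng.normal(size=n).cumsum()
lab_reports = np.roll(searches, 2) + rng.normal(scale=0.5, size=n)
df = pd.DataFrame({"lab_reports": lab_reports, "searches": searches})

# 1) Granger causality: does the predictor precede changes in lab reports?
#    (Tests whether the second column Granger-causes the first.)
res = grangercausalitytests(df[["lab_reports", "searches"]], maxlag=4,
                            verbose=False)   # res holds per-lag test stats

# 2) Variable importance in the context of other predictors.
X = pd.DataFrame({"searches_lag2": np.roll(searches, 2),
                  "noise": rng.normal(size=n)})
rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, lab_reports)
perm = permutation_importance(rf, X, lab_reports, n_repeats=10, random_state=0)
print(dict(zip(X.columns, perm.importances_mean.round(3))))
# rf.feature_importances_ is the impurity-based (node purity) analogue.
```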


Subject(s)
Caliciviridae Infections , Gastroenteritis , Norovirus , Humans , Infodemiology , England/epidemiology , Gastroenteritis/epidemiology , Caliciviridae Infections/epidemiology
3.
Front Vet Sci ; 8: 728691, 2021.
Article in English | MEDLINE | ID: mdl-34790712

ABSTRACT

The precision with which animal diseases are diagnosed affects our ability to make informed decisions about animal health management, from both a clinical and an economic perspective. Lameness is a major health condition in dairy cattle. Its underlying causes include bovine digital dermatitis (BDD), which is reported as one of the main causes of infectious lameness in dairy cattle. Presently, the gold standard for BDD diagnosis in dairy cattle is visual inspection of lifted hooves, a labour-intensive and subjective method. Research has suggested that Treponema spp. are the main pathogens associated with the establishment of BDD. We explored the potential of an indirect enzyme-linked immunosorbent assay (ELISA) as a serological tool for identifying cows at different stages of BDD. Additionally, we evaluated the predictive power of this diagnostic tool for the future occurrence of BDD lesions. A total of 232 cows from three farms were used in the study. Serum samples and hoof health data were collected at three time points: approximately 30 days pre-calving, around calving, and approximately 30 days post-calving. The mean absorbance from the ELISA test was compared across different clinical presentations of BDD, as assessed by visual inspection of the hooves according to the M-stage classification system. A transition model was developed to estimate the probability of lesion occurrence at time t + 1 based on the spectrophotometer (absorbance) reading at time t. The mean absorbance reading for both IgG1 and IgG2 anti-Treponema antibodies was associated with disease presence: apart from M4.1 lesions, animals with no lesions had a lower mean than animals with lesions, regardless of the score. Additionally, the mean absorbance reading of animals with active lesions was higher than that of animals with no lesions. However, the anti-Treponema antibody assays failed to identify disease presence consistently. Moreover, indirect ELISA readings were not a predictor of the future occurrence of BDD lesions. In conclusion, although levels of anti-Treponema antibodies were associated with disease presence, the ELISA test failed to detect disease unequivocally and had no predictive value for the future occurrence of BDD lesions.
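A transition model of this kind can be sketched as a logistic regression of lesion status at the next sampling point on the current absorbance reading. The sketch below uses simulated data and collapses the M-stage system to a binary lesion outcome; the real model may distinguish more states.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: ELISA absorbance at time t and whether a BDD lesion
# was observed at the next sampling point (t + 1).
rng = np.random.default_rng(2)
n = 232
absorbance_t = rng.normal(loc=1.0, scale=0.3, size=n)
lesion_next = rng.binomial(1, 1 / (1 + np.exp(-(absorbance_t - 1.0))))
df = pd.DataFrame({"lesion_next": lesion_next, "absorbance_t": absorbance_t})

# Two-state transition model: P(lesion at t + 1 | absorbance at t).
fit = smf.logit("lesion_next ~ absorbance_t", data=df).fit()
print(fit.summary())
```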

4.
PLoS One ; 16(8): e0256638, 2021.
Article in English | MEDLINE | ID: mdl-34432849

ABSTRACT

BACKGROUND: The COVID-19 pandemic has impacted surveillance activities for multiple pathogens. From March 2020, there was a decline in the number of norovirus and Campylobacter reports recorded by England's national laboratory surveillance system. The aim of this study was to estimate and compare the impact of the COVID-19 pandemic on norovirus and Campylobacter surveillance data in England. METHODS: We utilised two quasi-experimental approaches based on a generalised linear model for sequential count data. The first approach estimates the overall impact, and the second focuses on the impact of specific elements of the pandemic response (COVID-19 diagnostic testing and control measures). The following time series (week 27, 2015 to week 43, 2020) were used: weekly laboratory-confirmed norovirus and Campylobacter reports, air temperature, the number of SARS-CoV-2 tests conducted, and an index of the stringency of COVID-19 control measures. RESULTS: The period of SARS-CoV-2 emergence and subsequent sustained transmission was associated with persistent reductions in norovirus laboratory reports (p = 0.001), whereas for Campylobacter the reductions were more pronounced during pandemic emergence and later recovered (p = 0.075). The total estimated reduction was 47-79% for norovirus (weeks 12-43, 2020). The total reduction varied over time for Campylobacter, e.g. 19-33% in April and 1-7% in August. CONCLUSION: Laboratory reporting of norovirus was more adversely impacted than that of Campylobacter by the COVID-19 pandemic. This may be partially explained by a comparatively stronger effect of behavioural interventions on norovirus transmission and a relatively greater reduction in norovirus testing capacity. Our study underlines the differential impact a pandemic may have on surveillance of gastrointestinal infectious diseases.
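The first quasi-experimental approach amounts to an interrupted time-series regression for counts. The sketch below illustrates the idea with a negative binomial GLM on simulated weekly counts; the covariates, coefficients, and the week at which the pandemic indicator switches on are all illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated weekly laboratory reports with a pandemic-period indicator.
rng = np.random.default_rng(3)
weeks = np.arange(278)                    # ~week 27, 2015 to week 43, 2020
pandemic = (weeks >= 245).astype(int)     # weeks from roughly March 2020
temp = 10 + 8 * np.sin(2 * np.pi * weeks / 52.18)
reports = rng.poisson(np.exp(5 - 0.05 * temp - 0.8 * pandemic))
df = pd.DataFrame({"reports": reports, "temp": temp,
                   "pandemic": pandemic, "week": weeks})

# exp(coefficient) on `pandemic` estimates the multiplicative change in
# reports associated with the pandemic period.
fit = smf.negativebinomial("reports ~ week + temp + pandemic", data=df).fit()
print(np.exp(fit.params["pandemic"]))     # e.g. 0.45 => a 55% reduction
```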


Subject(s)
COVID-19/epidemiology , Caliciviridae Infections/diagnosis , Campylobacter Infections/diagnosis , Laboratories/statistics & numerical data , COVID-19/virology , COVID-19 Testing , Caliciviridae Infections/epidemiology , Caliciviridae Infections/virology , Campylobacter/isolation & purification , Campylobacter Infections/epidemiology , Campylobacter Infections/microbiology , England/epidemiology , Humans , Norovirus/isolation & purification , Pandemics , SARS-CoV-2/isolation & purification
5.
Mater Sci Eng C Mater Biol Appl ; 127: 112200, 2021 Aug.
Article in English | MEDLINE | ID: mdl-34225853

ABSTRACT

Self-assembling peptide hydrogels (SAPHs) are popular biomaterials due to their biocompatibility with a wide range of cell types, their synthetic design, structural properties that provide a more accurate 3D microenvironment, and their potential as cell- and/or drug-delivery systems. Mimicking solid tumors in vitro using hydrogels is one method of testing anti-cancer drug efficacy and observing cancerous cell-ECM interactions within a 3D system. In this study, a SAPH, PeptiGel®Alpha1, was used to model the 3D breast tumor microenvironment in vitro. PeptiGel®Alpha1 is composed of entangled nanofibers of consistent diameter and has mechanical properties that mimic the stiffness of breast tumor tissue more accurately than Matrigel® or collagen type I. PeptiGel®Alpha1 supported the viability and growth of the breast cancer cell lines MCF-7 and MDA-MB-231 and recapitulated key features of solid tumors such as hypoxia and invasion. MCF-7 cells in the hydrogels formed large spheroids resembling acini, while MDA-MB-231 cells remained dispersed. When treated with tamoxifen, PeptiGel®Alpha1 acted as a barrier, giving a drug penetration geometry similar to that in vivo and hence a better prediction of the drug effect. Finally, MCF-7 cells were observed to engulf the peptide matrix after 14 days, highlighting a potential use in drug delivery. PeptiGel®Alpha1 is a suitable platform for in vitro modeling of breast cancer.


Subject(s)
Breast Neoplasms , Hydrogels , Breast Neoplasms/pathology , Cell Line, Tumor , Collagen Type I , Disease Progression , Female , Humans , MCF-7 Cells , Peptides , Tumor Microenvironment
6.
Front Public Health ; 9: 630449, 2021.
Article in English | MEDLINE | ID: mdl-33912529

ABSTRACT

COVID-19 has disrupted everyday life worldwide and is the first disease event since the 1918 H1N1 Spanish influenza (flu) pandemic to demand an urgent global healthcare response. There has been much debate on whether the damage of COVID-19 is due predominantly to the pathogen itself or to our response to it. We compare SARS-CoV-2 against three other major pandemics (the 1347 Black Death, the 1520s New World smallpox outbreaks, and the 1918 Spanish flu pandemic) over the course of 700 years to unearth similarities and differences in pathogen, social and medical context, human response and behavior, and long-term social and economic impact that should be used to shape COVID-19 decision-making. We conclude that less than 100 years ago, pandemic disease events were still largely uncontrolled and unexplained. The extensive damage wreaked by historical pandemics on health, economy, and society was a function of pathogen characteristics and a lack of public health resources. Though there remain many similarities in patterns of disease spread and response from 1300 onwards, the major risks posed by COVID-19 arise not from the pathogen but from the indirect effects of control measures on health and core societal activities. Our understanding of the epidemiology and effective treatment of this virus has rapidly improved, and attention is shifting toward the identification of long-term control strategies that balance consideration of health in at-risk populations, societal behavior, and economic impact. Policymakers should use lessons from previous pandemics to develop appropriate risk assessments and control plans for now-endemic COVID-19, and for future pandemics.


Subject(s)
COVID-19 , Influenza A Virus, H1N1 Subtype , Influenza Pandemic, 1918-1919 , Influenza, Human , History, 20th Century , Humans , Influenza, Human/epidemiology , SARS-CoV-2
7.
BMJ Open ; 11(2): e044707, 2021 02 08.
Article in English | MEDLINE | ID: mdl-33558359

ABSTRACT

OBJECTIVES: Reporting of COVID-19 cases, deaths and testing has often lacked the context needed to assess disease burden within risk groups appropriately. This research considers how routine surveillance data might provide initial insights and identify risk factors, setting COVID-19 deaths early in the pandemic into context. This will facilitate understanding of the wider consequences of a pandemic from the earliest stage, reducing fear, aiding accurate assessment of disease burden and ensuring appropriate disease mitigation. SETTING: UK, 2020. PARTICIPANTS: The study is a secondary analysis of routine, public domain, surveillance data and information from the Office for National Statistics (ONS), National Health Service (NHS) 111 and Public Health England (PHE) on deaths and disease. PRIMARY AND SECONDARY OUTCOME MEASURES: Our principal focus is ONS data on deaths mentioning COVID-19 on the death certificate. We also consider information provided in NHS 111 and PHE data summaries. RESULTS: Deaths with COVID-19 significantly contributed to, yet did not entirely explain, abnormally elevated all-cause mortality in the UK during weeks 12-18 of 2020. Early in the UK epidemic, COVID-19 was the greatest threat to those with underlying illness, rarely endangering people aged under 40 years. COVID-19-related death rates differed by region, possibly reflecting underlying population structure. The risk of COVID-19-related death was greater for healthcare and social care staff and for black, Asian and minority ethnic individuals, after allowing for documented risk factors. CONCLUSION: Early contextualisation of public health data is critical to recognising who gets sick, when and why. Understanding at-risk groups facilitates a targeted response that considers the indirect consequences of society's reaction to a pandemic alongside disease-related impacts. COVID-19-related deaths mainly mirror historical patterns, and excess non-COVID-19-related deaths partly reflect reduced access to and uptake of healthcare during lockdown. Future outbreak response will improve through better understanding of the connectivity between disease monitoring systems, aiding interpretation of disease risk patterns and facilitating nuanced mitigation measures.
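The excess-mortality comparison underlying the first result can be illustrated with a simple observed-minus-expected calculation. The sketch below uses made-up weekly death counts and a five-year same-week mean as the expected baseline; the figures are illustrative assumptions, not ONS data.

```python
import numpy as np

# Hypothetical weekly all-cause deaths for the current year and a baseline.
rng = np.random.default_rng(4)
week = np.arange(52)
expected = rng.normal(10500, 400, size=(5, 52)).mean(axis=0)  # 5-year mean
surge = np.where((week >= 12) & (week <= 18), 4000, 0)
observed = expected + surge
covid_deaths = np.where((week >= 12) & (week <= 18), 3000, 0)

excess = observed - expected
non_covid_excess = excess - covid_deaths   # excess not explained by COVID-19
print(f"total excess: {excess.sum():.0f}, "
      f"not explained by COVID-19: {non_covid_excess.sum():.0f}")
```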


Subject(s)
COVID-19/mortality , Communicable Disease Control/organization & administration , Mortality , Pandemics , Adult , Female , Humans , Male , State Medicine , United Kingdom/epidemiology
8.
Front Vet Sci ; 7: 542, 2020.
Article in English | MEDLINE | ID: mdl-32974403

ABSTRACT

Lameness is a serious concern in the dairy sector, reflecting its high incidence and its impact on animal welfare and productivity. Research has provided figures on its frequency using different methodologies, making it difficult to compare results and hindering farm-level decision-making. The study's objectives were to determine the frequency of lameness in British dairy cattle through a meta-analysis approach, and to understand the chronological patterns of how lameness cases are detected and classified in scientific research. A systematic review was conducted using PRISMA-P guidelines for article selection. Random-effects models estimated the pooled frequency of lameness, with heterogeneity managed through subgroup analysis and meta-regression. Sixty-eight papers were identified, of which 50 included prevalence data and 36 included incidence data. The pooled prevalence of lameness in British dairy cattle was estimated at 29.5% (95% CI 26.7-32.4%), whilst the all-cause lameness incidence rate was 30.9 cases per 100 cow-years (95% CI 24.5-37.9). The pooled cause-specific lameness incidence rate per 100 cow-years was 66.1 (95% CI 24.1-128.8) for white line disease, 53.2 (95% CI 20.5-101.2) for sole ulcer and 53.6 (95% CI 19.2-105.3) for digital dermatitis, with 51.9 (95% CI 9.3-129.2) attributable to other lameness-related lesions. Heterogeneity levels remained high. Sixty-nine papers contributed to a chronological overview of lameness data sources. Although the AHDB Dairy mobility scoring system (MSS) was launched in the UK in 2008 and adopted shortly after by the British dairy sector as the standard tool for assessing lameness, other methods are used depending on the investigator. Automated lameness detection systems may offer a solution to the subjective nature of MSSs, yet they were used in only one study. Despite the recognition of under-reporting of lameness in farm records, 22 (31.9%) studies used this data source. The diversity of lameness data collection methods and sources was a key finding; it limits understanding of the lameness burden and the refinement of policy making for lameness. Standardizing case definitions and research methods would improve knowledge of, and the ability to manage, lameness. Regardless of the measurement method, the frequency of lameness in British dairy cattle is high.
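The pooled figures come from random-effects meta-analysis. A compact DerSimonian-Laird implementation on logit-transformed prevalences is sketched below; the study counts are invented for illustration, and the paper may have pooled on a different scale.

```python
import numpy as np

# Hypothetical per-study data: lame cows / cows examined.
events = np.array([210, 95, 330, 60])
totals = np.array([700, 310, 1200, 220])

p = events / totals
y = np.log(p / (1 - p))                  # logit prevalence per study
v = 1 / events + 1 / (totals - events)   # approximate variance of the logit

w = 1 / v                                # fixed-effect weights
y_fe = (w * y).sum() / w.sum()
Q = (w * (y - y_fe) ** 2).sum()          # Cochran's Q heterogeneity statistic
tau2 = max(0.0, (Q - (len(y) - 1)) / (w.sum() - (w ** 2).sum() / w.sum()))

w_re = 1 / (v + tau2)                    # random-effects weights
y_re = (w_re * y).sum() / w_re.sum()
se = np.sqrt(1 / w_re.sum())
pooled = 1 / (1 + np.exp(-y_re))         # back-transform to a prevalence
lo, hi = 1 / (1 + np.exp(-(y_re + np.array([-1.96, 1.96]) * se)))
print(f"pooled prevalence {pooled:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```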

9.
BMC Infect Dis ; 19(1): 12, 2019 Jan 05.
Article in English | MEDLINE | ID: mdl-30611217

ABSTRACT

BACKGROUND: Outbreaks of infectious gastroenteritis in care homes are common, with norovirus a frequent cause. In England there is no co-ordinated national surveillance system for these outbreaks. We aimed to estimate their burden. METHODS: Using a generalised linear mixed-effects regression model, we described the relationship between the observed number of care home outbreaks and covariates. Estimated model parameters were used to infer the uplift in the number of outbreaks expected if all areas were subject to enhanced surveillance. From this we then estimated the total burden of care home gastroenteritis outbreaks in this period. RESULTS: We estimated a total of 14,146 care home gastroenteritis outbreaks in England during 2014-2016; this is 47% higher than the reported total and corresponds to a rate of 32.4 outbreaks per 100 care homes per year. The median number of outbreaks from the model estimates was 31 (IQR 20-46), compared to 19 (IQR 12-34) reported from routine surveillance. CONCLUSIONS: This estimate of the care home gastroenteritis burden in England indicates that current surveillance substantially underestimates the number of outbreaks, by almost half. Improving this surveillance could provide better epidemiological knowledge of the burden of norovirus to inform public health policy, particularly with the advent of norovirus vaccines.
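The uplift logic, predicting what every area would report under enhanced surveillance and summing, can be illustrated with a plain Poisson GLM. The study used a generalised linear mixed-effects model; the sketch below drops the random effects and invents all data, so it shows the idea only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical area-level outbreak counts with an enhanced-surveillance flag.
rng = np.random.default_rng(5)
n_areas = 150
enhanced = rng.binomial(1, 0.3, n_areas)
care_homes = rng.integers(50, 300, n_areas)
mu = care_homes * 0.3 * np.where(enhanced, 1.0, 0.6)  # under-ascertainment
df = pd.DataFrame({"outbreaks": rng.poisson(mu), "enhanced": enhanced,
                   "log_homes": np.log(care_homes)})

fit = smf.poisson("outbreaks ~ enhanced", data=df,
                  offset=df["log_homes"]).fit()

# Predict every area's count as if it had enhanced surveillance.
predicted = fit.predict(df.assign(enhanced=1), offset=df["log_homes"])
print(f"estimated total: {predicted.sum():.0f} "
      f"vs reported: {df['outbreaks'].sum()}")
```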


Subject(s)
Caliciviridae Infections/epidemiology , Disease Outbreaks/statistics & numerical data , Gastroenteritis/epidemiology , Nursing Homes/statistics & numerical data , Aged , Aged, 80 and over , Child , Child, Preschool , England/epidemiology , Gastroenteritis/virology , Hospitalization/statistics & numerical data , Humans , Infant , Infant, Newborn , Norovirus/isolation & purification , Population Surveillance , Residence Characteristics/statistics & numerical data , Schools/statistics & numerical data
10.
Foodborne Pathog Dis ; 15(10): 589-597, 2018 10.
Article in English | MEDLINE | ID: mdl-30109958

ABSTRACT

Norovirus (NoV) is the commonest cause of gastrointestinal disease in the United Kingdom and in many developed countries, causing diarrhea and vomiting in millions of cases worldwide annually. Transmission most often occurs from person to person. NoV infection has, however, additionally been associated with the consumption of food, either food contaminated at source, such as seafood, berries, and salad, or food contaminated in some way by a food handler during processing or serving. A systematic review of outbreaks attributed to NoV between January 2003 and July 2017 was conducted to assess the contribution of food handlers to the burden of NoV and to identify foods commonly associated with NoV outbreaks. A total of 3021 articles were screened, of which 27 met the definition of confirmed foodborne outbreaks and 47 met the criteria for definite food-handler NoV outbreaks. Of all food types, shellfish were implicated in the greatest number of definite foodborne outbreaks. Food handlers contributed to definite food-handler outbreaks involving a diverse range of foodstuffs and in a wide variety of settings, including weddings and military establishments. More genotypes of NoV were found in people who were ill than in samples from food and food handlers. The potential for both food products and food handlers to contribute to the burden of NoV infection is demonstrated conclusively.


Subject(s)
Caliciviridae Infections/epidemiology , Disease Outbreaks , Food Handling , Foodborne Diseases/epidemiology , Gastroenteritis/epidemiology , Norovirus/isolation & purification , Food Contamination/analysis , Food Microbiology , Foodborne Diseases/virology , Gastroenteritis/virology , Humans , Odds Ratio , RNA, Viral/genetics , Shellfish/virology
11.
BMC Vet Res ; 14(1): 185, 2018 Jun 15.
Article in English | MEDLINE | ID: mdl-29907108

ABSTRACT

BACKGROUND: Liver fluke infection, caused by the parasite Fasciola hepatica, is a major cause of production losses to the cattle industry in the UK. To investigate farm-level risk factors for fluke infection, a randomised method was required to recruit an appropriate number of herds from a defined geographical area into the study. The approach taken, and the hurdles encountered in designing and implementing this study, are described. The county of Shropshire, England, was selected for the study because of the variation between farms in exposure to fluke infection observed in an earlier study. RESULTS: From a sampling list of 569 holdings in Shropshire randomly drawn from the RADAR cattle population dataset, 396 (69.6%) holdings were successfully contacted by telephone and asked whether they would be interested in taking part in the study. Of the 296 farmers who agreed to receive information packs by post, 195 (65.9%) agreed to take part. Over the period October 2014 to April 2015, visits were made to 100 dairy and 95 non-dairy herds. During the farm visits, 40 faecal samples, with or without bulk-tank milk samples, were collected and a questionnaire was administered. Composite faecal samples were analysed for the presence of F. hepatica eggs by sedimentation, and bulk-tank milk samples were tested with an antibody ELISA for F. hepatica. Forty-five (49%) non-dairy herds were positive for liver fluke infection, as determined by the finding of one or more fluke eggs, while 36 (36%) dairy herds had fluke-positive faecal samples and 41 (41%) dairy herds were positive for F. hepatica antibody. Eighty-seven (45.8%) farmers said that they monitored their cattle for liver fluke infection, and 118 (62.1%) reported that they used flukicide drugs in their cattle. CONCLUSIONS: Using a protocol of contacting farmers directly by telephone and subsequently sending information by post, 79% of the target sample size was successfully recruited into the study. A dataset of farm-specific information on possible risk factors for liver fluke infection, together with the corresponding infection status, was generated for the development of statistical models to identify risk factors for liver fluke infection at the farm level.


Subject(s)
Cattle Diseases/diagnosis , Clinical Protocols , Fascioliasis/veterinary , Animals , Cattle , Dairying , Datasets as Topic , Fasciola hepatica , Fascioliasis/diagnosis , Female , Interviews as Topic , Male , Patient Selection , Random Allocation , Surveys and Questionnaires
13.
Infect Immun ; 86(1)2018 01.
Article in English | MEDLINE | ID: mdl-28993458

ABSTRACT

Fasciola hepatica is a parasitic trematode of global importance in livestock. Control strategies reliant on anthelmintics are unsustainable due to the emergence of drug resistance. Vaccines are under development, but efficacies are variable. Evidence from experimental infection suggests that vaccine efficacy may be affected by parasite-induced immunomodulation. Little is known about the immune response to F. hepatica following natural exposure. Hence, we analyzed the immune responses over time in calves naturally exposed to F. hepatica infection. Cohorts of replacement dairy heifer calves (n = 42) with no prior exposure to F. hepatica, on three commercial dairy farms, were sampled over the course of a grazing season. Exposure was determined through an F. hepatica-specific serum antibody enzyme-linked immunosorbent assay (ELISA) and fluke egg counts. Concurrent changes in peripheral blood leukocyte subpopulations, lymphocyte proliferation, and cytokine responses were measured. Relationships between fluke infection and immune responses were analyzed using multivariable linear mixed-effects models. All calves from one farm showed evidence of exposure, while the cohorts from the remaining two farms remained negative over the grazing season. A type 2 immune response was associated with exposure, with increased interleukin-4 (IL-4) production, IL-5 transcription, and eosinophilia. Suppression of parasite-specific peripheral blood mononuclear cell (PBMC) proliferation was evident, while decreased mitogen-stimulated gamma interferon (IFN-γ) production suggested immunomodulation that was not restricted to parasite-specific responses. Our findings show that the global immune response is modulated toward a nonproliferative type 2 state following natural challenge with F. hepatica. This has implications for the timing of vaccination programs and for host susceptibility to coinfecting pathogens.
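The mixed-model step can be sketched with statsmodels: a fixed effect for exposure and a random intercept per calf to absorb repeated measures. All data below are simulated, and the response, covariates, and grouping are illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical repeated measures: a cytokine response per calf per visit.
rng = np.random.default_rng(6)
n_calves, n_visits = 42, 4
calf = np.repeat(np.arange(n_calves), n_visits)
exposed = np.repeat(rng.binomial(1, 0.4, n_calves), n_visits)
il4 = (1.0 + 0.8 * exposed
       + rng.normal(0, 0.2, n_calves)[calf]          # calf-level intercept
       + rng.normal(0, 0.3, n_calves * n_visits))    # residual noise
df = pd.DataFrame({"il4": il4, "exposed": exposed, "calf": calf})

# Linear mixed-effects model: fixed exposure effect, random intercept per calf.
fit = smf.mixedlm("il4 ~ exposed", data=df, groups=df["calf"]).fit()
print(fit.summary())
```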


Subject(s)
Cattle Diseases/immunology , Cattle Diseases/parasitology , Cell Proliferation/physiology , Fasciola hepatica/immunology , Leukocytes, Mononuclear/immunology , Leukocytes, Mononuclear/parasitology , Animals , Anthelmintics/immunology , Antibodies, Helminth/immunology , Antigens, Helminth/immunology , Cattle , Drug Resistance/immunology , Egg Hypersensitivity/immunology , Feces/parasitology , Female , Interleukin-4/immunology , Interleukin-5/immunology , Parasite Egg Count/methods
14.
Prev Vet Med ; 122(1-2): 145-53, 2015 Nov 01.
Article in English | MEDLINE | ID: mdl-26431926

ABSTRACT

Avian influenza (AI) and Newcastle disease (ND) are the most important reportable poultry diseases worldwide. Low pathogenic AI (H9N2) and ND viruses are known to have been circulating in the Middle East, including in Oman, for many decades. However, detailed information on the occurrence of these pathogens is almost completely lacking in Oman. As backyard poultry are not vaccinated against either virus in Oman, this is likely to be the poultry production sector most affected by both diseases. Here, in the first survey of AI and ND viruses in backyard poultry in Oman, we report high flock-level seroprevalences of both viruses. Serum and oropharyngeal swabs were taken from 2350 birds in 243 backyard flocks from all regions and governorates of Oman. Information was recorded on location, type of bird and housing type for each sampled farm. Individual bird serum samples were tested using commercial indirect antibody detection ELISA kits. Pooled oropharyngeal samples from each flock were inoculated onto FTA cards and tested by RT-PCR. Samples came from chickens (90.5%), turkeys (2.1%), ducks (6.2%), guinea fowl (0.8%) and geese (0.4%). The bird-level seroprevalence of antibody to AI and ND viruses was 37.5% and 42.1% respectively, and at the flock level it was 84% and 90% respectively. There were statistically significant differences between some regions of Oman in the seroprevalence of both viruses. Flock-level NDV seropositivity in chickens was significantly associated with AIV seropositivity, and marginally negatively associated with flock size. AIV seropositivity in chickens was marginally negatively associated with altitude. All oropharyngeal samples were negative for both viruses by RT-PCR, consistent with a short duration of infection. This study demonstrates that eight or nine out of ten backyard poultry flocks in Oman are exposed to AI and ND viruses and may present an infection risk to the commercial poultry sector in Oman, and to wild birds, which could carry infection further afield.
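The flock-level associations reported here are the kind a logistic regression recovers. The sketch below simulates flock data with the stated direction of effects; all coefficients and prevalences are invented for illustration, not the survey's estimates.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical flock-level data: NDV status against AIV status and flock size.
rng = np.random.default_rng(7)
n = 243
aiv_pos = rng.binomial(1, 0.84, n)
flock_size = rng.integers(5, 60, n)
logit = -0.5 + 1.2 * aiv_pos - 0.02 * flock_size
ndv_pos = rng.binomial(1, 1 / (1 + np.exp(-logit)))
df = pd.DataFrame({"ndv_pos": ndv_pos, "aiv_pos": aiv_pos,
                   "flock_size": flock_size})

fit = smf.logit("ndv_pos ~ aiv_pos + flock_size", data=df).fit()
print(np.exp(fit.params))   # odds ratios for each predictor
```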


Subject(s)
Influenza A Virus, H9N2 Subtype/isolation & purification , Influenza in Birds/epidemiology , Newcastle Disease/epidemiology , Newcastle disease virus/isolation & purification , Poultry Diseases/epidemiology , Animals , Female , Influenza in Birds/virology , Newcastle Disease/virology , Oman/epidemiology , Poultry , Poultry Diseases/virology , Prevalence , Risk Factors , Seroepidemiologic Studies
15.
PLoS One ; 8(6): e66054, 2013.
Article in English | MEDLINE | ID: mdl-23840399

ABSTRACT

Salmonella spp. are a major foodborne zoonotic cause of human illness. Consumption of pork products is believed to be a major source of human salmonellosis, and Salmonella control throughout the food chain is recommended. A number of on-farm interventions have been proposed, and some have been implemented, in order to try to achieve Salmonella control. In this study we utilize previously developed models describing Salmonella dynamics to investigate the potential effects of a range of these on-farm interventions. As the models indicated that the number of bacteria shed in the faeces of an infectious animal was a key factor, interventions applied within a high-shedding scenario were also analysed. From simulation of the model, the probability of infection after Salmonella exposure was found to be a key driver of Salmonella transmission. The model also highlighted that minimising physiological stress can have a large effect, but only when shedding levels are not excessive. When shedding was high, weekly cleaning and disinfection was not effective in controlling Salmonella; however, cleaning might have an effect if conducted more often. Furthermore, separating infectious animals shedding bacteria at a high rate from the rest of the population was found to minimise the spread of Salmonella.
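The mechanics described, shedding into a shared environment, dose-dependent exposure and periodic cleaning, can be captured in a few lines. The sketch below is a deliberately minimal stochastic pen-level simulation; every rate and probability is an illustrative assumption, not a parameter from the study's models.

```python
import numpy as np

# Minimal stochastic simulation: infectious pigs shed into the environment,
# environmental exposure infects susceptibles, weekly cleaning removes
# part of the contamination.
rng = np.random.default_rng(8)
n_pigs, days = 100, 120
shed_rate = 50.0           # infectious units shed per pig per day
p_per_unit = 1e-4          # infection probability per environmental unit
clean_every, clean_eff = 7, 0.9

infected = np.zeros(n_pigs, dtype=bool)
infected[:2] = True        # seed infections
env = 0.0
for day in range(days):
    env += infected.sum() * shed_rate
    if day % clean_every == 0:
        env *= 1 - clean_eff                   # cleaning and disinfection
    p = 1 - np.exp(-p_per_unit * env)          # dose-response style exposure
    infected |= (~infected) & (rng.random(n_pigs) < p)
print(f"prevalence at day {days}: {infected.mean():.0%}")
```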


Subject(s)
Computer Simulation , Models, Biological , Salmonella Infections, Animal/prevention & control , Swine Diseases/prevention & control , Animal Husbandry , Animals , Bacterial Shedding , Disinfection , Feces , Housing, Animal , Prevalence , Salmonella , Salmonella Infections, Animal/epidemiology , Salmonella Infections, Animal/transmission , Stress, Physiological , Sus scrofa/microbiology , Sus scrofa/physiology , Swine , Swine Diseases/epidemiology , Swine Diseases/transmission , United Kingdom/epidemiology
16.
Math Biosci ; 245(2): 148-56, 2013 Oct.
Article in English | MEDLINE | ID: mdl-23796599

ABSTRACT

A multi-group semi-stochastic model is formulated to describe Salmonella dynamics in a pig herd within the UK and to assess whether farm structure has any effect on the dynamics. The models include both direct transmission and indirect transmission (via free-living infectious units in the environment and airborne infection). The basic reproduction number R0 is also investigated. The models estimate that approximately 24.6% and 25.4% of pigs at slaughter weight will be infected with Salmonella in a slatted-floored and a solid-floored unit respectively, which corresponds to values found in previous abattoir and farm studies, suggesting that the models have reasonable validity. Analysis of the models identified the shedding rate as particularly important in the control of Salmonella spread, a finding also reflected in an increase in the R0 value.
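For a multi-group model, R0 is conventionally computed as the spectral radius of the next-generation matrix. A small sketch with invented entries:

```python
import numpy as np

# K[i, j]: expected infections in group i caused by one infectious pig in
# group j. The entries below are illustrative, not the paper's estimates.
K = np.array([
    [1.2, 0.3],
    [0.4, 1.1],
])
R0 = max(abs(np.linalg.eigvals(K)))   # spectral radius
print(f"R0 = {R0:.2f}")

# Raising the shedding rate scales the transmission entries and hence R0,
# consistent with the model's sensitivity to shedding.
print(f"R0 with doubled shedding: {max(abs(np.linalg.eigvals(2 * K))):.2f}")
```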


Subject(s)
Models, Biological , Salmonella Infections, Animal/transmission , Swine Diseases/transmission , Abattoirs , Animal Husbandry/instrumentation , Animals , Basic Reproduction Number , Computational Biology , Markov Chains , Salmonella Infections, Animal/epidemiology , Salmonella Infections, Animal/prevention & control , Stochastic Processes , Sus scrofa , Swine , Swine Diseases/epidemiology , Swine Diseases/prevention & control , United Kingdom/epidemiology
17.
Pediatrics ; 126(4): e946-53, 2010 Oct.
Article in English | MEDLINE | ID: mdl-20837597

ABSTRACT

BACKGROUND: Dietary nucleotides are nonprotein nitrogenous compounds that are found in high concentrations in breast milk and are thought to be conditionally essential nutrients in infancy. A high nucleotide intake has been suggested to explain some of the benefits of breastfeeding compared with formula feeding and to promote infant growth. However, relatively few large-scale randomized trials have tested this hypothesis in healthy infants. OBJECTIVE: We tested the hypothesis that nucleotide supplementation of formula benefits early infant growth. PATIENTS AND METHODS: Occipitofrontal head circumference, weight, and length were assessed in infants who were randomly assigned to groups fed nucleotide-supplemented (31 mg/L; n=100) or control formula without nucleotide supplementation (n=100) from birth to the age of 20 weeks, and in infants who were breastfed (reference group; n=101). RESULTS: Infants fed with nucleotide-supplemented formula had greater occipitofrontal head circumference at ages 8, 16, and 20 weeks than infants fed control formula (mean difference in z scores at 8 weeks: 0.4 [95% confidence interval: 0.1-0.7]; P=.006) even after adjustment for potential confounding factors (P=.002). Weight at 8 weeks and the increase in both occipitofrontal head circumference and weight from birth to 8 weeks were also greater in infants fed nucleotide-supplemented formula than in those fed control formula. CONCLUSIONS: Our data support the hypothesis that nucleotide supplementation leads to increased weight gain and head growth in formula-fed infants. Therefore, nucleotides could be conditionally essential for optimal infant growth in some formula-fed populations. Additional research is needed to test the hypothesis that the benefits of nucleotide supplementation for early head growth, a critical period for brain growth, have advantages for long-term cognitive development.


Subject(s)
Growth , Infant Formula , Nucleotides/administration & dosage , Head/growth & development , Humans , Infant , Infant, Newborn , Weight Gain
18.
Prev Vet Med ; 89(1-2): 67-74, 2009 May 01.
Article in English | MEDLINE | ID: mdl-19303153

ABSTRACT

Salmonella spp. are important food-borne pathogens. Abattoir studies demonstrated that almost a quarter of British finisher pigs might carry Salmonella, which led the British Pig Executive to introduce its Zoonoses Action Plan (ZAP) to monitor the Salmonella status of United Kingdom pig farms by testing meat juice samples using an ELISA system. We used the K-function and approaches from the field of geostatistics to study routine data from ZAP. We found statistical evidence that geographically localized anomalies of Salmonella infection were present in one of the three regions studied. The physical mechanisms underlying this structure remain unclear: spatial structure might be present as a result of shared spatially structured (second-order) or non-spatially structured (first-order) risk factors, transmission processes, or a combination of both. We have demonstrated a way to use routinely collected surveillance data to enhance knowledge of spatial disease epidemiology.
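Ripley's K-function compares the number of point pairs within distance d to the count expected under complete spatial randomness (for which K(d) = πd²). A naive estimator, without the edge corrections a real analysis would use, on invented farm coordinates:

```python
import numpy as np
from scipy.spatial.distance import pdist

# Hypothetical farm locations in a 100 km x 100 km study region.
rng = np.random.default_rng(9)
pts = rng.uniform(0, 100, size=(80, 2))
area, n = 100 * 100, len(pts)
dists = pdist(pts)                      # pairwise distances, each pair once

def ripley_k(d):
    # Naive estimator without edge correction, for illustration only.
    return area * 2 * (dists <= d).sum() / (n * (n - 1))

for d in (5, 10, 20):
    print(f"d={d:>2} km: K={ripley_k(d):8.1f}  CSR={np.pi * d ** 2:8.1f}")
```

Values of K(d) well above πd² would indicate clustering at that scale.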


Subject(s)
Salmonella Infections, Animal/epidemiology , Salmonella Infections, Animal/transmission , Swine Diseases/epidemiology , Swine Diseases/transmission , Zoonoses , Abattoirs , Animals , Enzyme-Linked Immunosorbent Assay/veterinary , Exudates and Transudates/microbiology , Female , Humans , Male , Prevalence , Risk Factors , Salmonella Food Poisoning/prevention & control , Sentinel Surveillance/veterinary , Space-Time Clustering , Swine , United Kingdom/epidemiology
19.
Int J Food Microbiol ; 131(2-3): 95-105, 2009 May 31.
Article in English | MEDLINE | ID: mdl-19232769

ABSTRACT

Milk sold as pasteurized has historically been implicated, in the UK and worldwide, as a vehicle for outbreaks of food-borne gastrointestinal disease caused by a number of pathogenic organisms. One such organism is verocytotoxigenic Escherichia coli O157 (VTEC O157). We present a quantitative assessment of likely exposure to VTEC O157 via milk sold as pasteurized in the UK. Of particular interest is whether there is any differential risk between milk processed in on-farm and off-farm dairies. We model the milk production chain from the farm through to the point of retail and compare these two production environments. Our model is an example of the Modular Process Risk Modelling (MPRM) approach and represents uncertainty and variability in input parameters using probability distributions. We conclude that milk processed on farm poses the comparatively greater risk, although that risk is still small.
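The MPRM idea, chaining modules with probability distributions on uncertain and variable inputs, lends itself to Monte Carlo simulation. The sketch below strings together a toy farm-to-retail chain; every distribution and parameter is an illustrative assumption, not a value from the assessment.

```python
import numpy as np

# Monte Carlo over a modular chain: raw milk -> pasteurisation -> serving.
rng = np.random.default_rng(10)
n = 100_000

raw_conc = rng.lognormal(mean=-2.0, sigma=1.5, size=n)  # CFU/ml, raw milk
log_red = rng.normal(loc=5.0, scale=0.5, size=n)        # pasteurisation step
failure = rng.random(n) < 0.01                          # process failure rate
post = np.where(failure, raw_conc, raw_conc * 10.0 ** -log_red)

dose = post * 250                                       # 250 ml serving
p_ill = 1 - np.exp(-0.005 * dose)                       # exponential dose-response
print(f"mean risk per serving: {p_ill.mean():.2e}")
```

Running the same chain with different module parameters (e.g. a higher on-farm failure rate) is how the two production environments would be compared.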


Subject(s)
Dairying , Environmental Exposure , Escherichia coli O157 , Food Handling , Food Microbiology , Food Preservation/methods , Milk/microbiology , Animals , Cattle , Computer Simulation , Escherichia coli O157/isolation & purification , Humans , Models, Biological , Probability , Risk Assessment
20.
Environ Health ; 8 Suppl 1: S19, 2009 Dec 21.
Article in English | MEDLINE | ID: mdl-20102586

ABSTRACT

BACKGROUND: The quantification of uncertainty and variability is a key component of quantitative risk analysis. Recent advances in Bayesian statistics make it well suited to integrating multiple sources of information of different types and quality, and to providing a realistic estimate of the combined uncertainty in the final risk estimates. METHODS: We present two case studies related to foodborne microbial risks. In the first, we combine models to describe the sequence of events resulting in illness from consumption of milk contaminated with VTEC O157. We used Monte Carlo simulation to propagate uncertainty in some of the inputs to computer models describing the farm and pasteurisation process. The resulting simulated contamination levels were then assigned to consumption events from a dietary survey. Finally, we accounted for uncertainty in the dose-response relationship and uncertainty due to limited incidence data to derive the uncertainty about yearly incidences of illness in young children. Options for altering the risk were considered by running the model with different hypothetical policy-driven exposure scenarios. In the second case study we illustrate an efficient Bayesian sensitivity analysis for identifying the most important parameters of a complex computer code that simulated VTEC O157 prevalence within a managed dairy herd. This was carried out in two stages: first screening out the unimportant inputs, then performing a more detailed analysis on the remaining inputs. The method works by building a Bayesian statistical approximation to the computer code using a number of known code input/output pairs (training runs). RESULTS: We estimated that the expected total number of children aged 1.5-4.5 years who become ill due to VTEC O157 in milk is 8.6 per year, with a 95% uncertainty interval of (0, 11.5). The most extreme policy we considered was banning on-farm pasteurisation of milk, which reduced the estimate to 6.4 with a 95% interval of (0, 11). In the second case study the effective number of inputs was reduced from 30 to 7 in the screening stage, and just 2 inputs were found to explain 82.8% of the output variance. A combined total of 500 runs of the computer code were used. CONCLUSION: These case studies illustrate the use of Bayesian statistics to perform detailed uncertainty and sensitivity analyses, integrating multiple information sources in a way that is both rigorous and efficient.
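The second case study's approach, emulating the expensive simulator with a statistical approximation fitted to a modest number of training runs and then interrogating the cheap emulator, can be sketched with a Gaussian process. The toy simulator, input count and run budget below are all illustrative assumptions; the paper's Bayesian sensitivity analysis is more sophisticated than this first-order estimate.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(11)

def simulator(x):        # stand-in for the expensive herd-prevalence code
    return 3 * x[:, 0] + np.sin(5 * x[:, 1]) + 0.1 * x[:, 2]

X_train = rng.uniform(0, 1, size=(60, 3))        # known input/output pairs
gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True)
gp.fit(X_train, simulator(X_train))

total_var = gp.predict(rng.uniform(0, 1, size=(5000, 3))).var()
for j in range(3):
    # First-order effect: variance over x_j of the mean output, with the
    # other inputs integrated out by Monte Carlo on the emulator.
    means = []
    for v in np.linspace(0, 1, 30):
        sample = rng.uniform(0, 1, size=(200, 3))
        sample[:, j] = v
        means.append(gp.predict(sample).mean())
    print(f"input {j}: ~{np.var(means) / total_var:.0%} of output variance")
```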


Subject(s)
Escherichia coli Infections/epidemiology , Foodborne Diseases/epidemiology , Shiga-Toxigenic Escherichia coli , Animals , Bayes Theorem , Case-Control Studies , Cattle , Cohort Studies , Computer Simulation , Humans , Milk/microbiology , Milk/poisoning , Monte Carlo Method , Risk Assessment/methods , Shiga-Toxigenic Escherichia coli/isolation & purification , United Kingdom/epidemiology