Results 1-19 of 19
1.
Prev Med ; 177: 107774, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37992976

ABSTRACT

Installation of technologies to remove or deactivate respiratory pathogens from indoor air is a plausible non-pharmaceutical infectious disease control strategy. OBJECTIVE: We undertook a systematic review of worldwide observational and experimental studies, published 1970-2022, to synthesise evidence about the effectiveness of suitable indoor air treatment technologies to prevent respiratory or gastrointestinal infections. METHODS: We searched for data about infection and symptom outcomes for persons who spent a minimum of 20 h/week in shared indoor spaces subjected to air treatment strategies hypothesised to change the risk of respiratory or gastrointestinal infections or symptoms. RESULTS: Pooled data from 32 included studies suggested no net benefit of air treatment technologies for symptom severity or symptom presence, in the absence of confirmed infection. Infection incidence was lower in three cohort studies for persons exposed to high efficiency particulate air (HEPA) filtration (RR 0.4, 95%CI 0.28-0.58, p < 0.001) and in one cohort study that combined ionisers with electrostatic nano filtration (RR 0.08, 95%CI 0.01-0.60, p = 0.01); other types of air treatment technologies, and air treatment in other study designs, were not strongly linked to fewer infections. The infection outcome data exhibited strong publication bias. CONCLUSIONS: Although pathogen counts in environmental and surface samples are reduced by several air treatment strategies, especially germicidal lights and HEPA filtration, robust evidence has yet to emerge that these technologies are effective at reducing respiratory or gastrointestinal infections in real-world settings. Several randomised trials have yet to report; their data will be a welcome addition to the evidence base.
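The risk-ratio arithmetic behind results like "RR 0.4, 95%CI 0.28-0.58, p < 0.001" can be checked from the reported interval alone: on the log scale, a 95% CI spans roughly 2 x 1.96 standard errors. A minimal sketch (the numbers are the HEPA result quoted above; the function name is ours):

```python
from math import log

def se_from_ci(lo, hi, z_crit=1.96):
    """Recover the log-scale standard error implied by a reported 95% CI."""
    return (log(hi) - log(lo)) / (2 * z_crit)

# HEPA filtration result reported above: RR 0.4 (95% CI 0.28-0.58)
rr, lo, hi = 0.40, 0.28, 0.58
se = se_from_ci(lo, hi)
z = log(rr) / se          # Wald z statistic for log RR against RR = 1
print(f"SE(log RR) = {se:.3f}, z = {z:.2f}")
```

A z of about -4.9 is consistent with the reported p < 0.001.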


Subject(s)
Respiratory Tract Infections , Humans , Cohort Studies , Respiratory Tract Infections/prevention & control
2.
Lancet Public Health ; 8(11): e850-e858, 2023 11.
Article in English | MEDLINE | ID: mdl-37832574

ABSTRACT

BACKGROUND: During the COVID-19 pandemic, cases were tracked using multiple surveillance systems. Some systems were completely novel, and others incorporated multiple data streams to estimate case incidence and prevalence. How well these different surveillance systems worked as epidemic indicators is unclear, which has implications for future disease surveillance and outbreak management. The aim of this study was to compare case counts, prevalence and incidence, timeliness, and comprehensiveness of different COVID-19 surveillance systems in England. METHODS: For this retrospective observational study of COVID-19 surveillance systems in England, data from 12 surveillance systems were extracted from publicly available sources (Jan 1, 2020-Nov 30, 2021). The main outcomes were correlations between different indicators of COVID-19 incidence or prevalence. These data were integrated as daily time-series and comparisons undertaken using Spearman correlation between candidate alternatives and the most timely (updated daily, clinical case register) and the least biased (from comprehensive household sampling) COVID-19 epidemic indicators, with comparisons focused on the period of Sept 1, 2020-Nov 30, 2021. 
FINDINGS: Spearman statistic correlations during the full focus period between the least biased indicator (from household surveys) and other epidemic indicator time-series were 0·94 (95% CI 0·92 to 0·95; clinical cases, the most timely indicator), 0·92 (0·90 to 0·94; estimates of incidence generated after incorporating information about self-reported case status on the ZoeApp, which is a digital app), 0·67 (95% CI 0·60 to 0·73, emergency department attendances), 0·64 (95% CI 0·60 to 0·68, NHS 111 website visits), 0·63 (95% CI 0·56 to 0·69, wastewater viral genome concentrations), 0·60 (95% CI 0·52 to 0·66, admissions to hospital with positive COVID-19 status), 0·45 (95% CI 0·36 to 0·52, NHS 111 calls), 0·08 (95% CI -0·03 to 0·18, Google search rank for "covid"), -0·04 (95% CI -0·12 to 0·05, in-hours consultations with general practitioners), and -0·37 (95% CI -0·46 to -0·28, Google search rank for "coronavirus"). Time lags (-14 to +14 days) did not markedly improve these rho statistics. Clinical cases (the most timely indicator) captured a more consistent proportion of cases than the self-report digital app did. INTERPRETATION: A suite of monitoring systems is useful. The household survey system was the most comprehensive and least biased epidemic monitor, but not very timely. Data from laboratory testing, the self-reporting digital app, and attendances to emergency departments were comparatively useful, fairly accurate, and timely epidemic trackers. FUNDING: National Institute for Health and Care Research Health Protection Research Unit in Emergency Preparedness and Response, a partnership between the UK Health Security Agency, King's College London, and the University of East Anglia.
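The comparison method above (Spearman correlation between daily indicator time-series, re-checked at time lags of -14 to +14 days) can be sketched in a few lines. This is an illustrative simplification, not the study's code: the rank function ignores ties, and the series below are made up.

```python
def ranks(xs):
    """Ranks 0..n-1 (no tie correction -- a simplification for this sketch)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = float(rank)
    return r

def spearman(x, y):
    """Spearman rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

def best_lag(ref, other, max_lag=14):
    """Slide `other` by -max_lag..+max_lag days against `ref` and return
    the (lag, rho) pair with the strongest absolute correlation."""
    best = (0, spearman(ref, other))
    for k in range(-max_lag, max_lag + 1):
        if k == 0:
            continue
        a, b = (ref[k:], other[:-k]) if k > 0 else (ref[:k], other[-k:])
        if len(a) < 3:
            continue
        rho = spearman(a, b)
        if abs(rho) > abs(best[1]):
            best = (k, rho)
    return best
```

In the study, `ref` would be the least-biased household-survey series and `other` a candidate indicator (e.g. wastewater viral genome concentrations).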


Subject(s)
COVID-19 , Humans , COVID-19/epidemiology , Pandemics/prevention & control , England/epidemiology , Retrospective Studies , London
3.
Sci Total Environ ; 892: 164441, 2023 Sep 20.
Article in English | MEDLINE | ID: mdl-37245822

ABSTRACT

Some types of poultry bedding made from recycled materials have been reported to contain environmental contaminants such as polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs, dioxins), polychlorinated biphenyls (PCBs), brominated flame retardants (BFRs), polychlorinated naphthalenes (PCNs), polybrominated dioxins (PBDD/Fs), perfluoroalkyl substances (PFAS), etc. In one of the first studies of its kind, the uptake of these contaminants by chicken muscle tissue, liver, and eggs from three types of recycled, commercially available bedding material was simultaneously investigated using conventional husbandry to raise day-old chickens to maturity. A weight of evidence analysis showed that PCBs, polybrominated diphenylethers (PBDEs), PCDD/Fs, PCNs and PFAS displayed the highest potential for uptake, which varied depending on the type of bedding material used. During the first three to four months of laying, an increasing trend was observed in the concentrations of ΣTEQ (summed toxic equivalence of PCDD/Fs, PCBs, PBDD/Fs, PCNs and polybrominated biphenyls), NDL-PCBs and PBDEs in the eggs of chickens raised on shredded cardboard. Further analysis using bio-transfer factors (BTFs) when egg production reached a steady state revealed that some PCB congeners (PCBs 28, 81, 138, 153 and 180), irrespective of molecular configuration or chlorine number, showed the highest tendency for uptake. Conversely, BTFs for PBDEs showed good correlation with bromine number, increasing to a maximum value for BDE-209. This relationship was reversed for PCDFs (and to some extent for PCDDs), with tetra- and penta-chlorinated congeners showing a greater tendency for selective uptake. The overall patterns were consistent, although some variability in BTF values was observed between tested materials, which may relate to differences in bioavailability.
The results indicate a potentially overlooked source of food chain contamination as other livestock products (cow's milk, lamb, beef, duck, etc.) could be similarly impacted.
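The bio-transfer factors (BTFs) used above are, in essence, ratios relating a congener's concentration in the animal product to its concentration in the exposure medium. A sketch under that simplified definition (published BTF definitions vary, e.g. intake-normalised forms; the concentrations below are hypothetical, not study data):

```python
def biotransfer_factor(c_product, c_medium):
    """BTF as animal-product concentration divided by the concentration in
    the exposure medium -- one common simplification; definitions vary."""
    return c_product / c_medium

# hypothetical congener concentrations (same units in egg and bedding)
bedding = {"BDE-209": 9.0, "PCB-153": 2.5}
egg = {"BDE-209": 45.0, "PCB-153": 30.0}
btf = {k: biotransfer_factor(egg[k], bedding[k]) for k in bedding}
```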


Subject(s)
Dioxins , Fluorocarbons , Polychlorinated Biphenyls , Polychlorinated Dibenzodioxins , Female , Cattle , Animals , Sheep , Dioxins/analysis , Polychlorinated Biphenyls/analysis , Chickens , Polychlorinated Dibenzodioxins/analysis , Dibenzofurans/analysis , Halogenated Diphenyl Ethers/analysis , Fluorocarbons/analysis , Dibenzofurans, Polychlorinated/analysis , Environmental Monitoring
4.
Ann Epidemiol ; 82: 66-76.e6, 2023 06.
Article in English | MEDLINE | ID: mdl-37001627

ABSTRACT

PURPOSE: Most index cases with novel coronavirus infections transmit disease to just one or two other individuals, but some individuals "super-spread": they infect many secondary cases. Understanding common factors that super-spreaders may share could inform outbreak models and guide contact tracing during outbreaks. METHODS: We searched MEDLINE, Scopus, and preprints to identify studies about people documented as transmitting the pathogens that cause SARS, MERS, or COVID-19 to at least nine other people. We extracted data to describe them by age, sex, location, occupation, activities, symptom severity, any underlying conditions, and disease outcome, and undertook quality assessment for outbreaks published by June 2021. RESULTS: The most typical super-spreader was a male aged 40+. Most SARS or MERS super-spreaders were very symptomatic, the super-spreading occurred in hospital settings, and the individual frequently died. In contrast, COVID-19 super-spreaders often had very mild disease, and most COVID-19 super-spreading happened in community settings. CONCLUSIONS: SARS and MERS super-spreaders were often symptomatic, middle- or older-age adults who had a high mortality rate. In contrast, COVID-19 super-spreaders tended to have mild disease and were of any adult age. More outbreak reports should be published with anonymized but useful demographic information to improve understanding of super-spreading, super-spreaders, and the settings in which super-spreading happens.


Subject(s)
COVID-19 , Adult , Male , Humans , COVID-19/epidemiology , SARS-CoV-2 , Disease Outbreaks
5.
Sci Rep ; 13(1): 3893, 2023 03 23.
Article in English | MEDLINE | ID: mdl-36959189

ABSTRACT

Vibrio vulnificus is an opportunistic bacterial pathogen occurring in warm, low-salinity waters. V. vulnificus wound infections due to seawater exposure are infrequent but mortality rates are high (~18%). Seawater bacterial concentrations are increasing, but assessments of changing disease patterns and climate change projections are rare. Here, using a 30-year database of V. vulnificus cases for the Eastern USA, changing disease distribution was assessed. An ecological niche model was developed, trained and validated to identify links to oceanographic and climate data. This model was used to predict future disease distribution using data simulated by seven Global Climate Models (GCMs) belonging to the newest Coupled Model Intercomparison Project (CMIP6). Risk was estimated by calculating the total population within 200 km of the disease distribution. Predictions were generated for different "pathways" of global socioeconomic development which incorporate projections of greenhouse gas emissions and demographic change. In the Eastern USA between 1988 and 2018, V. vulnificus wound infections increased eightfold (10-80 cases p.a.) and the northern case limit shifted northwards by 48 km p.a. By 2041-2060, V. vulnificus infections may expand their current range to encompass major population centres around New York (40.7°N). Combined with a growing and increasingly elderly population, annual case numbers may double. By 2081-2100, V. vulnificus infections may be present in every Eastern USA state under medium-to-high future emissions and warming. The projected expansion of V. vulnificus wound infections stresses the need for increased individual and public health awareness in these areas.


Subject(s)
Vibrio Infections , Vibrio vulnificus , Wound Infection , Humans , Aged , Vibrio Infections/epidemiology , North America
6.
Sci Total Environ ; 765: 142787, 2021 Apr 15.
Article in English | MEDLINE | ID: mdl-33246727

ABSTRACT

Many types of bioresource materials are beneficially recycled in agriculture for soil improvement and as alternative bedding materials for livestock, but they also potentially transfer contaminants into plant and animal foods. Representative types of industrial and municipal bioresources were selected to assess the extent of organic chemical contamination, including: (i) land-applied materials: treated sewage sludge (biosolids), meat and bone meal ash (MBMA), poultry litter ash (PLA), paper sludge ash (PSA) and compost-like-output (CLO), and (ii) bedding materials: recycled waste wood (RWW), dried paper sludge (DPS), paper sludge ash (PSA) and shredded cardboard. The materials generally contained lower concentrations of polychlorinated dibenzo-p-dioxins/dibenzofurans (PCDD/Fs) and dioxin-like polychlorinated biphenyls (PCBs) relative to earlier reports, indicating the decline in environmental emissions of these established contaminants. However, concentrations of polycyclic aromatic hydrocarbons (PAHs) remain elevated in biosolids samples from urban catchments. Polybrominated dibenzo-p-dioxins/dibenzofurans (PBDD/Fs) were present in larger amounts in biosolids and CLO compared to their chlorinated counterparts and hence are of potentially greater significance in contemporary materials. The presence of non-ortho PCBs in DPS was probably due to non-legacy sources of PCBs in paper production. Flame retardant chemicals were one of the most significant and extensive groups of contaminants found in the bioresource materials. Decabromodiphenylether (deca-BDE) was the most abundant polybrominated diphenyl ether (PBDE) and may explain the formation and high concentrations of PBDD/Fs detected. Emerging flame retardant compounds, including decabromodiphenylethane (DBDPE) and organophosphate flame retardants (OPFRs), were also detected in several of the materials.
The profile of perfluoroalkyl substances (PFAS) depended on the type of waste category; perfluoroundecanoic acid (PFUnDA) was the most significant PFAS for DPS, whereas perfluorooctane sulfonate (PFOS) was dominant in biosolids and CLO. The concentrations of polychlorinated alkanes (PCAs) and di-2-ethylhexyl phthalate (DEHP) were generally much larger than the other contaminants measured, indicating that there are major anthropogenic sources of these potentially hazardous chemicals entering the environment. The study results suggest that continued vigilance is required to control emissions and sources of these contaminants to support the beneficial use of secondary bioresource materials.


Subject(s)
Polychlorinated Biphenyls , Polychlorinated Dibenzodioxins , Agriculture , Animals , Dibenzofurans , Environmental Monitoring , Polychlorinated Biphenyls/analysis , Polychlorinated Dibenzodioxins/analysis , United Kingdom
7.
Euro Surveill ; 25(49)2020 12.
Article in English | MEDLINE | ID: mdl-33303066

ABSTRACT

BACKGROUND: Evidence for face-mask wearing in the community to protect against respiratory disease is unclear. AIM: To assess the effectiveness of wearing face masks in the community to prevent respiratory disease, and recommend improvements to this evidence base. METHODS: We systematically searched Scopus, Embase and MEDLINE for studies evaluating respiratory disease incidence after face-mask wearing (or not). Narrative synthesis and random-effects meta-analysis of attack rates for primary and secondary prevention were performed, subgrouped by design, setting, face barrier type, and who wore the mask. The preferred outcome was influenza-like illness. Grading of Recommendations, Assessment, Development and Evaluations (GRADE) quality assessment was undertaken and evidence base deficits described. RESULTS: 33 studies (12 randomised controlled trials (RCTs)) were included. Mask wearing reduced primary infection by 6% (odds ratio (OR): 0.94; 95% CI: 0.75-1.19 for RCTs) to 61% (OR: 0.85; 95% CI: 0.32-2.27; OR: 0.39; 95% CI: 0.18-0.84 and OR: 0.61; 95% CI: 0.45-0.85 for cohort, case-control and cross-sectional studies respectively). RCTs suggested the lowest secondary attack rates when both well and ill household members wore masks (OR: 0.81; 95% CI: 0.48-1.37). While RCTs might underestimate effects due to poor compliance and controls wearing masks, observational studies likely overestimate effects, as mask wearing might be associated with other risk-averse behaviours. GRADE evidence quality was low or very low. CONCLUSION: Wearing face masks may reduce primary respiratory infection risk, probably by 6-15%. It is important to balance evidence from RCTs and observational studies when their conclusions widely differ and both are at risk of significant bias. COVID-19-specific studies are required.
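The odds ratios pooled above come from 2x2 attack-rate tables (infected/uninfected by masked/unmasked). A minimal sketch of the per-study calculation with a Woolf (log-scale) confidence interval; the counts are hypothetical, not taken from any included study:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR for a 2x2 attack-rate table (a infected / b uninfected with masks;
    c infected / d uninfected without), with a Woolf log-scale 95% CI."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# hypothetical counts: 20/180 infected among mask wearers, 40/160 without
or_, lo, hi = odds_ratio_ci(20, 180, 40, 160)
```

A random-effects meta-analysis would then combine the per-study log ORs weighted by their variances.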


Subject(s)
COVID-19/prevention & control , Eye Protective Devices , Influenza, Human/prevention & control , Masks , Picornaviridae Infections/prevention & control , Respiratory Tract Infections/prevention & control , Tuberculosis/prevention & control , COVID-19/transmission , Coronavirus Infections/prevention & control , Coronavirus Infections/transmission , Humans , Influenza, Human/transmission , Picornaviridae Infections/transmission , Respiratory Protective Devices , Respiratory Tract Infections/transmission , SARS-CoV-2 , Tuberculosis/transmission
8.
Sci Total Environ ; 683: 240-248, 2019 Sep 15.
Article in English | MEDLINE | ID: mdl-31132703

ABSTRACT

Common ragweed is a highly allergenic invasive species in Europe, expected to become widespread under climate change. Allergy to ragweed manifests as eye, nasal and lung symptoms, and children may retain these throughout life. The dose-response relationship between symptoms and pollen concentrations is unclear. We undertook a longitudinal study, assessing the association between ragweed pollen concentration and allergic eye, nasal and lung symptoms in children living under a range of ragweed pollen concentrations in Croatia. Over three years, 85 children completed daily diaries, detailing allergic symptoms alongside daily location, activities and medication, resulting in 10,130 individual daily entries. The daily ragweed pollen concentration for the children's locations was obtained, alongside daily weather and air pollution. Parents completed a home/lifestyle/medical questionnaire. Generalised Additive Mixed Models established the relationship between pollen concentrations and symptoms, alongside other covariates. Eye symptoms were associated with mean daily pollen concentration over four days (day of symptoms plus 3 previous days); 61 grains/m3/day (95%CI: 45, 100) was the threshold at which 50% of children reported symptoms. Nasal symptoms were associated with mean daily pollen concentration over 12 days (day of symptoms plus 11 previous days); the threshold for 50% of children reporting symptoms was 40 grains/m3/day (95%CI: 24, 87). Lung symptoms showed a relationship with mean daily pollen concentration over 19 days (day of symptoms plus 18 previous days), with a threshold of 71 grains/m3/day (95%CI: 59, 88). Taking medication on the day of symptoms showed higher odds, suggesting responsive behaviour. Taking medication on the day prior to symptoms showed lower odds of reporting, indicating preventative behaviour. Different symptoms in children demonstrate varying dose-response relationships with ragweed pollen concentrations. 
Each symptom type responded to pollen exposure over different time periods. Using medication prior to symptoms can reduce symptom presence. These findings can be used to better manage paediatric ragweed allergy symptoms.
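The exposure metric above is a trailing mean: pollen on the symptom day plus a symptom-specific number of previous days (4 days for eye, 12 for nasal, 19 for lung symptoms). A sketch of that windowed mean on a made-up pollen series:

```python
def lagged_mean(series, day, window):
    """Mean pollen over `day` plus the previous `window - 1` days --
    the trailing-window exposure metric described above."""
    start = max(0, day - window + 1)
    vals = series[start:day + 1]
    return sum(vals) / len(vals)

pollen = [10, 30, 80, 120, 60, 40, 20]   # hypothetical grains/m3/day
eye_exposure = lagged_mean(pollen, day=4, window=4)   # mean of days 1-4
print(eye_exposure, eye_exposure > 61)   # vs the ~61 grains/m3/day eye threshold
```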


Subject(s)
Allergens/adverse effects , Antigens, Plant/adverse effects , Plant Extracts/adverse effects , Rhinitis, Allergic, Seasonal/immunology , Allergens/analysis , Ambrosia/physiology , Antigens, Plant/analysis , Child , Child, Preschool , Croatia , Female , Humans , Longitudinal Studies , Male , Plant Extracts/analysis , Rhinitis, Allergic, Seasonal/etiology
9.
J Transl Med ; 17(1): 34, 2019 01 21.
Article in English | MEDLINE | ID: mdl-30665426

ABSTRACT

BACKGROUND: With over 800 million cases globally, campylobacteriosis is a major cause of food-borne disease. In temperate climates incidence is highly seasonal but the underlying mechanisms are poorly understood, making human disease control difficult. We hypothesised that observed disease patterns reflect complex interactions between weather, patterns of human risk behaviour, immune status and level of food contamination. Only by understanding these can we find effective interventions. METHODS: We analysed trends in human Campylobacter cases in NE England from 2004 to 2009, investigating the associations between different risk factors and disease using time-series models. We then developed an individual-based (IB) model of risk behaviour, human immunological responses to infection and environmental contamination driven by weather and land use. We parameterised the IB model for NE England and compared outputs to observed numbers of reported cases each month in the population in 2004-2009. Finally, we used it to investigate different community-level disease reduction strategies. RESULTS: Risk behaviours such as countryside visits (t = 3.665, P < 0.001 and t = -2.187, P = 0.029 for temperature and rainfall respectively) and consumption of barbecued food (t = 3.219, P = 0.002 and t = 2.015, P = 0.045 for weekly average temperature and average maximum temperature respectively; t = 2.254, P = 0.025 for rain) were strongly associated with weather. This suggests that the effect of weather was indirect, acting through changes in risk behaviour. The seasonal pattern of cases predicted by the IB model was significantly related to observed patterns (r = 0.72, P < 0.001), indicating that simulating risk behaviour could produce the observed seasonal patterns of cases. A vaccination strategy providing short-term immunity was more effective than educational interventions to modify human risk behaviour.
Extending immunity to 1 year from 20 days reduced disease burden by an order of magnitude (from 2412-2414 to 203-309 cases per 50,000 person-years). CONCLUSIONS: This is the first interdisciplinary study to integrate environment, risk behaviour, socio-demographics and immunology to model Campylobacter infection, including pathways to mitigation. We conclude that vaccination is likely to be the best route for intervening against campylobacteriosis despite the technical problems associated with understanding both the underlying human immunology and genetic variation in the pathogen, and the likely cost of vaccine development.


Subject(s)
Behavior , Campylobacter Infections/epidemiology , Climate , Cost of Illness , Environment , Models, Biological , Seasons , Animals , Chickens , England/epidemiology , Humans , Rain , Temperature
11.
Article in English | MEDLINE | ID: mdl-29949854

ABSTRACT

Ragweed allergy is a major public health concern. Within Europe, ragweed is an introduced species and research has indicated that the amounts of ragweed pollen are likely to increase over Europe due to climate change, with corresponding increases in ragweed allergy. To address this threat, improving our understanding of predisposing factors for allergic sensitisation to ragweed and disease is necessary, specifically focusing upon factors that are potentially modifiable (i.e., environmental). In this study, a total of 4013 children aged 2-13 years were recruited across Croatia to undergo skin prick tests to determine sensitisation to ragweed and other aeroallergens. A parental questionnaire collected home environment, lifestyle, family and personal medical history, and socioeconomic information. Environmental variables were obtained using Geographical Information Systems and data from nearby pollen, weather, and air pollution stations. Logistic regression was performed (clustered on school) focusing on risk factors for allergic sensitisation and disease. Ragweed sensitisation was strongly associated with ragweed pollen at levels over 5000 grains m⁻³ year⁻¹ and, above these levels, the risk of sensitisation was 12-16 times greater than in low pollen areas with about 400 grains m⁻³ year⁻¹. Genetic factors were strongly associated with sensitisation but nearly all potentially modifiable factors were insignificant. This included measures of local land use and proximity to potential sources of ragweed pollen. Rural residence was protective (odds ratio (OR) 0.73, 95% confidence interval (CI) 0.55-0.98), but the factors underlying this association were unclear. Being sensitised to ragweed doubled (OR 2.17, 95% CI 1.59-2.96) the risk of rhinoconjunctivitis. No other potentially modifiable risk factors were associated with rhinoconjunctivitis.
Ragweed sensitisation was strongly associated with ragweed pollen, and sensitisation was significantly associated with rhinoconjunctivitis. Apart from ragweed pollen levels, few other potentially modifiable factors were significantly associated with ragweed sensitisation. Hence, strategies to lower the risk of sensitisation should focus upon ragweed control.


Subject(s)
Ambrosia/immunology , Antigens, Plant/immunology , Hypersensitivity/epidemiology , Plant Extracts/immunology , Adolescent , Air Pollution , Allergens/adverse effects , Antigens, Plant/toxicity , Case-Control Studies , Child , Child, Preschool , Climate Change , Croatia/epidemiology , Female , Humans , Hypersensitivity/etiology , Male , Odds Ratio , Plant Extracts/toxicity , Pollen/immunology , Risk Factors , Skin Tests , Weather
12.
Appl Environ Microbiol ; 83(14)2017 07 15.
Article in English | MEDLINE | ID: mdl-28500040

ABSTRACT

This paper introduces a novel method for sampling pathogens in natural environments. It uses fabric boot socks worn over walkers' shoes to allow the collection of composite samples over large areas. Wide-area sampling is better suited to studies focusing on human exposure to pathogens (e.g., recreational walking). This sampling method is implemented using a citizen science approach: groups of three walkers wearing boot socks undertook one of six routes, 40 times over 16 months, in the North West (NW) and East Anglian (EA) regions of England. To validate this methodology, we report the successful implementation of this citizen science approach, the observation that Campylobacter bacteria were detected on 47% of boot socks, and the observation that multiple boot socks from individual walks produced consistent results. The findings indicate higher Campylobacter levels in the livestock-dominated NW than in EA (55.8% versus 38.6%). Seasonal differences in the presence of Campylobacter bacteria were found between the regions, with indications of winter peaks in both regions but a spring peak in the NW. The presence of Campylobacter bacteria on boot socks was negatively associated with ambient temperature (P = 0.011) and positively associated with precipitation (P < 0.001), results consistent with our understanding of Campylobacter survival and the probability of material adhering to boot socks. Campylobacter jejuni was the predominant species found; Campylobacter coli was largely restricted to the livestock-dominated NW. Source attribution analysis indicated that the potential source of C. jejuni was predominantly sheep in the NW and wild birds in EA but did not differ between peak and nonpeak periods of human incidence. IMPORTANCE: There is debate in the literature on the pathways through which pathogens are transferred from the environment to humans.
We report on the success of a novel method for sampling human-pathogen interactions using boot socks and citizen science techniques, which enable us to sample human-pathogen interactions that may occur through visits to natural environments. This contrasts with traditional environmental sampling, which is based on spot sampling techniques and does not sample human-pathogen interactions. Our methods are of practical value to scientists trying to understand the transmission of pathogens from the environment to people. Our findings provide insight into the risk of Campylobacter exposure from recreational visits and an understanding of seasonal differences in risk and the factors behind these patterns. We highlight the Campylobacter species predominantly encountered and the potential sources of C. jejuni.
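A regional comparison like the 55.8% versus 38.6% detection rates above is typically tested with a pooled two-proportion z statistic. A sketch; the sock counts are hypothetical values chosen only to reproduce those percentages, not the study's actual sample sizes:

```python
from math import sqrt

def two_prop_z(x1, n1, x2, n2):
    """Pooled two-proportion z statistic for comparing detection rates."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                  # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# hypothetical counts: NW 67/120 socks positive (55.8%), EA 44/114 (38.6%)
z = two_prop_z(67, 120, 44, 114)
```

|z| > 1.96 would indicate a difference significant at the 5% level.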


Subject(s)
Campylobacter Infections/microbiology , Campylobacter Infections/veterinary , Campylobacter/isolation & purification , Livestock/microbiology , Microbiological Techniques/methods , Animals , Animals, Wild/microbiology , Campylobacter/classification , Campylobacter/genetics , Campylobacter/physiology , England , Environment , Humans , Microbiological Techniques/instrumentation , Seasons , Shoes
13.
Environ Health Perspect ; 125(3): 385-391, 2017 03.
Article in English | MEDLINE | ID: mdl-27557093

ABSTRACT

BACKGROUND: Globally, pollen allergy is a major public health problem, but a fundamental unknown is the likely impact of climate change. To our knowledge, this is the first study to quantify the consequences of climate change upon pollen allergy in humans. OBJECTIVES: We produced quantitative estimates of the potential impact of climate change upon pollen allergy in humans, focusing upon common ragweed (Ambrosia artemisiifolia) in Europe. METHODS: A process-based model estimated the change in ragweed's range under climate change. A second model simulated current and future ragweed pollen levels. These findings were translated into health burdens using a dose-response curve generated from a systematic review and from current and future population data. Models considered two different suites of regional climate/pollen models, two greenhouse gas emissions scenarios [Representative Concentration Pathways (RCPs) 4.5 and 8.5], and three different plant invasion scenarios. RESULTS: Our primary estimates indicated that sensitization to ragweed will more than double in Europe, from 33 to 77 million people, by 2041-2060. According to our projections, sensitization will increase in countries with an existing ragweed problem (e.g., Hungary, the Balkans), but the greatest proportional increases will occur where sensitization is uncommon (e.g., Germany, Poland, France). Higher pollen concentrations and a longer pollen season may also increase the severity of symptoms. Our model projections were driven predominantly by changes in climate (66%) but were also influenced by current trends in the spread of this invasive plant species. Assumptions about the rate at which ragweed spreads throughout Europe had a large influence upon the results. CONCLUSIONS: Our quantitative estimates indicate that ragweed pollen allergy will become a common health problem across Europe, expanding into areas where it is currently uncommon. 
Control of ragweed spread may be an important adaptation strategy in response to climate change. Citation: Lake IR, Jones NR, Agnew M, Goodess CM, Giorgi F, Hamaoui-Laguel L, Semenov MA, Solomon F, Storkey J, Vautard R, Epstein MM. 2017. Climate change and future pollen allergy in Europe. Environ Health Perspect 125:385-391; http://dx.doi.org/10.1289/EHP173.


Subject(s)
Allergens/analysis , Climate Change/statistics & numerical data , Environmental Exposure/statistics & numerical data , Pollen , Rhinitis, Allergic, Seasonal/epidemiology , Europe/epidemiology , Hypersensitivity
14.
J Aging Phys Act ; 24(4): 599-616, 2016 10.
Article in English | MEDLINE | ID: mdl-27049356

ABSTRACT

We examine the relative importance of both objective and perceived environmental features for physical activity in older English adults. Self-reported physical activity levels of 8,281 older adults were used to compute volumes of outdoor recreational and commuting activity. Perceptions of neighborhood environment supportiveness were drawn from a questionnaire survey, and a geographical information system was used to derive objective measures. Negative binomial regression models were fitted to examine associations. Perceptions of neighborhood environment were more strongly associated with outdoor recreational activity (over 10% change per standard deviation) than objective measures (5-8% change). Commuting activity was associated with several objective measures (up to 16% change). We identified different environmental determinants of recreational and commuting activity in older adults. Perceptions of environmental supportiveness for recreational activity appear more important than actual neighborhood characteristics. Understanding how older people perceive neighborhoods might be key to encouraging outdoor recreational activity.
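The "% change per standard deviation" figures above follow from the log link used in negative binomial regression: a coefficient beta on a standardised predictor multiplies the expected activity volume by exp(beta) per 1 SD. A sketch of that conversion (the coefficient value is hypothetical):

```python
from math import exp

def pct_change_per_sd(beta):
    """With a log link (as in negative binomial regression), a coefficient
    beta on a standardised predictor implies a (exp(beta) - 1) * 100 %
    change in the expected count per 1 SD of that predictor."""
    return (exp(beta) - 1) * 100

# hypothetical coefficient on a standardised neighborhood-perception score
print(round(pct_change_per_sd(0.10), 1))   # about a 10.5% increase per SD
```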


Subject(s)
Environment , Exercise/physiology , Aged , England , Female , Geographic Information Systems , Humans , Male , Recreation , Self Report , Surveys and Questionnaires
15.
Health Place ; 24: 90-6, 2013 Nov.
Article in English | MEDLINE | ID: mdl-24071654

ABSTRACT

The health of rural and urban populations differs, with rural areas appearing healthier. However, it is unknown whether the benefit of living in rural areas is felt by individuals at all levels of deprivation, or whether some suffer a disadvantage of rural residence. For England and Wales, premature mortality rates for 2001-2003 were calculated, subdivided by individual deprivation and gender, for areas with differing rurality characteristics. Premature mortality data (age 50 to retirement) and a measure of the individual's deprivation (National Statistics Socio-economic Classification 1-7) were obtained from death certificates. Overall premature mortality was examined, as well as premature mortality subdivided by major cause. Male premature mortality rates (age 50-64) fell with increasing rurality for individuals in all socio-economic status classifications. The most deprived individuals benefitted most from residence in increasingly rural areas. Similar trends were observed when premature mortality was subdivided by the major causes of death. Female premature mortality rates (age 50-59) demonstrated similar trends, but the differences between urban and rural areas were less marked.
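The rates compared above are standard deaths-per-person-years figures. A minimal sketch of the calculation, using hypothetical counts (the study's actual deaths and populations are not reproduced in the abstract):

```python
def premature_mortality_rate(deaths: int, person_years: float, per: float = 100_000) -> float:
    """Deaths per `per` person-years at risk."""
    return deaths / person_years * per

# Hypothetical counts for most-deprived males aged 50-64 (NS-SEC 7)
# in an urban vs. a rural area type:
urban_rate = premature_mortality_rate(520, 48_000)
rural_rate = premature_mortality_rate(310, 42_000)
print(round(urban_rate), round(rural_rate))  # rate falls with increasing rurality
```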


Subject(s)
Mortality, Premature/trends , Rural Population , Social Class , Cause of Death , Death Certificates , England/epidemiology , Female , Health Services Accessibility/statistics & numerical data , Humans , Male , Middle Aged , Wales/epidemiology
16.
Soc Sci Med ; 74(12): 1929-38, 2012 Jun.
Article in English | MEDLINE | ID: mdl-22465380

ABSTRACT

Car use is associated with substantial health and environmental costs but research in deprived populations indicates that car access may also promote psychosocial well-being within car-oriented environments. This mixed-method (quantitative and qualitative) study examined this issue in a more affluent setting, investigating the socio-economic structure of car commuting in Cambridge, UK. Our analyses involved integrating self-reported questionnaire data from 1142 participants in the Commuting and Health in Cambridge study (collected in 2009) and in-depth interviews with 50 participants (collected 2009-2010). Even in Britain's leading 'cycling city', cars were a key resource in bridging the gap between individuals' desires and their circumstances. This applied both to long-term life goals such as home ownership and to shorter-term challenges such as illness. Yet car commuting was also subject to constraints, with rush hour traffic pushing drivers to start work earlier and with restrictions on, or charges for, workplace parking pushing drivers towards multimodal journeys (e.g. driving to a 'park-and-ride' site then walking). These patterns of car commuting were socio-economically structured in several ways. First, the gradient of housing costs made living near Cambridge more expensive, affecting who could 'afford' to cycle and perhaps making cycling the more salient local marker of Bourdieu's class distinction. Nevertheless, cars were generally affordable in this relatively affluent, highly-educated population, reducing the barrier which distance posed to labour-force participation. Finally, having the option of starting work early required flexible hours, a form of job control which in Britain is more common among higher occupational classes. 
Following a social model of disability, we conclude that socio-economic advantage can make car-oriented environments less disabling via both greater affluence and greater job control, and in ways manifested across the full socio-economic range. This suggests the importance of combining individual-level 'healthy travel' interventions with measures aimed at creating travel environments in which all social groups can pursue healthy and satisfying lives.


Subject(s)
Automobiles , Mental Health , Transportation , Adolescent , Adult , Aged , Automobiles/economics , Female , Humans , Male , Middle Aged , Qualitative Research , Socioeconomic Factors , Surveys and Questionnaires , Time Factors , Transportation/economics , United Kingdom , Young Adult
17.
Int J Behav Nutr Phys Act ; 9: 153, 2012 Dec 31.
Article in English | MEDLINE | ID: mdl-23276280

ABSTRACT

BACKGROUND: Activity levels are known to decline with age, and there is growing evidence of associations between the school environment and physical activity. In this study we investigated how objectively measured one-year changes in physical activity may be associated with school-related factors in 9- to 10-year-old British children. METHODS: Data were analysed from 839 children attending 89 schools in the SPEEDY (Sport, Physical Activity, and Eating behaviours: Environmental Determinants in Young People) study. Outcome variables were one-year changes in objectively measured sedentary, moderate, and vigorous physical activity, with baseline measures taken when the children were 9-10 years old. School characteristics hypothesised to be associated with change in physical activity were identified from questionnaires, grounds audits, and computer mapping. Associations were examined using simple and multivariable multilevel regression models for both school (9 am - 3 pm) and travel (8-9 am and 3-4 pm) time. RESULTS: Significant associations during school time included the length of the morning break, which was found to be supportive of moderate (β coefficient: 0.68 [p = 0.003]) and vigorous (β coefficient: 0.52 [p = 0.002]) activities and to help prevent adverse changes in sedentary time (β coefficient: -2.52 [p = 0.001]). During travel time, positive associations were found between the presence of safe places to cross roads around the school and changes in moderate (β coefficient: 0.83 [p = 0.022]) and vigorous (β coefficient: 0.56 [p = 0.001]) activity, as well as sedentary time (β coefficient: -1.61 [p = 0.005]).
CONCLUSION: This study suggests that having longer morning school breaks and providing road safety features such as cycling infrastructure, a crossing guard, and safe places for children to cross the road may have a role to play in supporting the maintenance of moderate and vigorous activity behaviours, and preventing the development of sedentary behaviours in children.
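The coefficients reported above are fixed effects from multilevel linear models of one-year change in activity minutes. A toy sketch of how such a coefficient translates into a predicted change, assuming (as an illustration only) that the morning-break coefficient is per extra minute of break, and omitting the school-level random effects:

```python
def predicted_change(beta: float, exposure: float, intercept: float = 0.0) -> float:
    """Fixed-effect part of a multilevel linear model: predicted
    one-year change in daily activity minutes at a given exposure level."""
    return intercept + beta * exposure

# Reported morning-break coefficient for moderate activity was 0.68;
# under the per-minute assumption, a 5-minute-longer break predicts:
print(round(predicted_change(0.68, 5), 1))  # 3.4 extra moderate-activity minutes
```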


Subject(s)
Bicycling , Schools , Walking , Child , Child Behavior , England , Environment , Ethnicity , Follow-Up Studies , Humans , Linear Models , Longitudinal Studies , Multilevel Analysis , Safety , Sedentary Behavior , Sports , Surveys and Questionnaires
18.
Int J Pediatr Obes ; 6(2-2): e574-81, 2011 Jun.
Article in English | MEDLINE | ID: mdl-20854106

ABSTRACT

PURPOSE: Little is known about school environmental factors that promote or inhibit activity, especially from studies using objective measures in large representative samples. We therefore aimed to study associations between activity intensities and physical and social school environmental factors. METHODS: A population-based sample of 1,908 British children (SPEEDY study), mean age 10.3 years (standard deviation [SD]: 0.3), with valid activity data (assessed with Actigraph accelerometers) was recruited from 92 schools across Norfolk, UK. Outcome measures were school-based (8 am-4 pm on weekdays) time (in minutes) spent in sedentary (<100 counts/min), moderate (2,000-3,999 counts/min) and vigorous (≥4,000 counts/min) activity. A total of 40 school physical and social environmental factors were assessed. Multivariable multilevel linear regression analyses adjusted for children's sex and body mass index were conducted; interactions with sex were investigated. RESULTS: Availability of a 'Park and Stride' scheme was negatively associated with sedentary minutes (-7.74; 95% CI: -14.8; -0.70). Minutes of moderate activity were associated with the availability of a lollipop person (1.33, 95% CI: 0.35; 2.62) and objectively-assessed walking provision (1.70, 95% CI: 0.85; 2.56). The number of sports facilities of at least medium quality (0.47, 95% CI: 0.16; 0.79), not having a policy on physical activity (-2.28, 95% CI: -3.62; -0.95), and, in boys only, provision of pedestrian training (1.89; 95% CI: 0.77; 3.01) were associated with minutes of vigorous activity. CONCLUSIONS: Only a small number of school-level factors were associated with children's objectively-measured physical activity intensity, giving few pointers for potential future intervention efforts. Further research should focus on using objective measures to elucidate what factors may explain the school-level variance in activity levels.


Subject(s)
Motor Activity , Schools , Sedentary Behavior , Social Environment , Actigraphy/instrumentation , Age Factors , Child , Child Behavior , England , Environment Design , Facility Design and Construction , Female , Health Behavior , Health Promotion , Humans , Linear Models , Male , Physical Education and Training , Risk Assessment , Risk Factors , Sex Factors , Surveys and Questionnaires , Time Factors , Transportation , Walking
19.
Health Place ; 16(5): 776-83, 2010 Sep.
Article in English | MEDLINE | ID: mdl-20435506

ABSTRACT

The aim of this study was to develop, test, and employ an audit tool to objectively assess the opportunities for physical activity within school environments. A 44-item tool was developed and tested at 92 primary schools in the county of Norfolk, England, during the summer term of 2007. Scores from the tool, covering 6 domains of facility provision, were examined against objectively measured hourly moderate to vigorous physical activity levels in 1,868 9- to 10-year-old pupils attending the schools. The tool was found to have acceptable reliability and good construct validity, differentiating the physical activity levels of children attending the highest and lowest scoring schools. The characteristics of school grounds may influence pupils' physical activity levels.


Subject(s)
Environment Design/statistics & numerical data , Motor Activity , Schools/statistics & numerical data , Child , England , Environment Design/standards , Humans , Schools/standards , Sports