1.
Eur J Clin Microbiol Infect Dis ; 38(9): 1709-1717, 2019 Sep.
Article in English | MEDLINE | ID: mdl-31302785

ABSTRACT

To investigate long-term health sequelae of cryptosporidiosis, with special reference to post-infectious irritable bowel syndrome (PI-IBS), a prospective cohort study was carried out. All patients with laboratory-confirmed, genotyped cryptosporidiosis in Wales, UK, aged between 6 months and 45 years, over a 2-year period were contacted. Five hundred and five patients agreed to participate and were asked to complete questionnaires (paper or online) at baseline, 3 and 12 months after diagnosis. The presence/absence of IBS was established using the Rome III criteria for different age groups. Two hundred and five of 505 cases completed questionnaires (40% response rate). At 12 months, over a third of cases reported persistent abdominal pain and diarrhoea, 28% reported joint pain and 26% reported fatigue. At both 3 and 12 months, the proportion reporting fatigue and abdominal pain after Cryptosporidium hominis infection was statistically significantly greater than after C. parvum infection. Overall, 10% of cases had sufficient symptoms to meet IBS diagnostic criteria. A further 27% met all criteria except the 6 months' duration requirement, and another 23% had several features of IBS but did not fulfil the strict Rome III criteria. There was no significant difference between C. parvum and C. hominis infection with regard to PI-IBS. Post-infectious gastrointestinal dysfunction and fatigue were commonly reported after cryptosporidiosis. Fatigue and abdominal pain were significantly more common after C. hominis than after C. parvum infection. Around 10% of people had symptoms meriting a formal diagnosis of IBS following cryptosporidiosis. Using age-specific Rome III criteria, children as well as adults were shown to be affected.


Subject(s)
Cryptosporidiosis/complications , Cryptosporidiosis/diagnosis , Irritable Bowel Syndrome/parasitology , Abdominal Pain/etiology , Adolescent , Adult , Arthralgia/etiology , Child , Child, Preschool , Cryptosporidium/genetics , Diarrhea/parasitology , Fatigue/etiology , Female , Follow-Up Studies , Genotype , Humans , Infant , Male , Middle Aged , Prospective Studies , Surveys and Questionnaires , Young Adult
3.
Arch Dis Child ; 101(6): 552-555, 2016 06.
Article in English | MEDLINE | ID: mdl-26893519

ABSTRACT

OBJECTIVE: To test the predictability of the National Health Service Institute for Innovation and Improvement (NHSIII) Paediatric Early Warning System (PEWS) score to identify children at risk of developing critical illness. DESIGN: Cohort study. SETTING: Admissions to all paediatric wards at the University Hospital of Wales between 1 December 2005 and 30 November 2006. OUTCOME MEASURES: Unscheduled paediatric high dependency unit (PHDU) admission, paediatric intensive care unit (PICU) admission and death. RESULTS: There were 9075 clinical observations from 1000 children. An NHSIII PEWS score of 2 or more, which triggers review, has a sensitivity of 73.2% (95% CI 62.2% to 82.4%), specificity of 75.2% (95% CI 74.3% to 76.1%), positive predictive value (PPV) of 2.6% (95% CI 2.0% to 3.4%), negative predictive value of 99.7% (95% CI 99.5% to 99.8%) and positive likelihood ratio of 3.0 (95% CI 2.6 to 3.4) for predicting PHDU admission, PICU admission or death. Six (37.5%) of the 16 children with an adverse outcome did not have an abnormal NHSIII PEWS score. The area under the receiver operating characteristic curve for the NHSIII PEWS score was 0.83 (95% CI 0.77 to 0.88). CONCLUSIONS: The NHSIII PEWS has a low PPV and its full implementation would result in a large number of false positive triggers. The issue with PEWS scores or triggers is neither their sensitivity nor the children with high scores who require clinical intervention (these are not 'false positives'), but their low specificity and low PPV, which arise from the large number of children with mildly raised scores.
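The trade-off the authors describe, a high negative predictive value but a very low PPV for a rare outcome, can be sketched numerically. The counts below are illustrative only, not the study's raw data:

```python
# Illustrative sketch of why a trigger for a rare outcome has a low PPV,
# even when sensitivity and specificity look reasonable.

def predictive_values(tp, fn, fp, tn):
    """Standard 2x2 screening metrics from true/false positives and negatives."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)   # of all triggers, the fraction that are true events
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

# Hypothetical counts: 16 adverse outcomes against 1800 event-free
# observations, with the trigger firing on a quarter of non-events.
sens, spec, ppv, npv = predictive_values(tp=12, fn=4, fp=450, tn=1350)
print(f"sensitivity={sens:.1%}  specificity={spec:.1%}  "
      f"PPV={ppv:.1%}  NPV={npv:.1%}")
```

With these hypothetical counts the PPV is about 2.6% despite 75% sensitivity and specificity, mirroring the pattern reported above: when true events are rare, almost all triggers are false alarms.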


Subject(s)
Critical Illness/therapy , Adolescent , Child , Child, Preschool , Cohort Studies , Critical Care/statistics & numerical data , Diffusion of Innovation , Humans , Infant , Primary Prevention/methods , Risk Assessment/methods , Risk Factors , State Medicine/statistics & numerical data , Wales
4.
Epidemiol Infect ; 143(3): 550-60, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25600667

ABSTRACT

A large measles outbreak occurred in South Wales in 2012/2013. The outbreak has been attributed to low take-up of measles-mumps-rubella (MMR) immunization in the early 2000s. To understand better the factors that led to this outbreak we present the findings of a case-control study carried out in the outbreak area in 2001 to investigate parents' decision on whether to accept MMR. Parents who decided not to take-up MMR at the time were more likely to be older and better educated, more likely to report being influenced by newspapers [adjusted odds ratio (aOR) 3·07, 95% confidence interval (CI) 1·62-5·80], television (aOR 3·30, 95% CI 1·70-6·43), the internet (aOR 7·23, 95% CI 3·26-16·06) and vaccine pressure groups (aOR 5·20, 95% CI 2·22-12·16), and less likely to be influenced by a health visitor (aOR 0·30, 95% CI 0·16-0·57). In this area of Wales, daily English-language regional newspapers, UK news programmes and the internet appeared to have a powerful negative influence. We consider the relevance of these findings to the epidemiology of the outbreak and the subsequent public health response.
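For readers unfamiliar with the odds ratios quoted here: a crude (unadjusted) OR and its Wald confidence interval come straight from a 2×2 exposure table. The adjusted ORs in the study come from a regression model, which this sketch does not reproduce; the function and counts below are illustrative assumptions, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a Wald 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) is sqrt of the summed reciprocal cell counts.
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Hypothetical counts: 30 of 100 non-accepting parents cited the internet,
# versus 10 of 200 accepting parents.
print(odds_ratio_ci(30, 70, 10, 190))
```

The CI is computed on the log scale because the sampling distribution of log(OR) is approximately normal, then exponentiated back; this is why published ORs have asymmetric intervals.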


Subject(s)
Attitude to Health , Communications Media , Immunization/statistics & numerical data , Measles-Mumps-Rubella Vaccine/administration & dosage , Patient Acceptance of Health Care , Adolescent , Adult , Case-Control Studies , Child, Preschool , Disease Outbreaks , Humans , Male , Measles/epidemiology , Parents , United Kingdom/epidemiology , Wales , Young Adult
5.
Arch Dis Child ; 99(1): 26-9, 2014 Jan.
Article in English | MEDLINE | ID: mdl-23995077

ABSTRACT

OBJECTIVE: To determine the use of paediatric early warning systems (PEWS) and rapid response teams (RRTs) in paediatric units in Great Britain. DESIGN: Cross-sectional survey. SETTING: All hospitals with inpatient paediatric services in Great Britain. OUTCOME MEASURES: Proportion of units using PEWS, origin of PEWS used, criteria included in PEWS, proportion of units with an RRT and membership of RRT. RESULTS: The response rate was 95% (149/157). 85% of units were using PEWS and 18% had an RRT in place. Tertiary units were more likely than district general hospitals to have implemented PEWS, 90% versus 83%, and an RRT, 52% versus 10%. A large number of PEWS were in use, the majority of which were unpublished and unvalidated systems. CONCLUSIONS: Despite the inconclusive evidence of effectiveness, the use of PEWS has increased since 2005. The implementation has been inconsistent with large variation in the PEWS used, the activation criteria used, availability of an RRT and the membership of the RRT. There must be a coordinated national evaluation of the implementation, impact and effectiveness of a standardised PEWS programme in the various environments where acutely sick children are managed.


Subject(s)
Clinical Alarms/statistics & numerical data , Critical Care/organization & administration , Hospital Rapid Response Team/statistics & numerical data , Hospitals/standards , Pediatrics/trends , Cross-Sectional Studies , Health Care Surveys , Hospitals/trends , Humans , Pediatrics/organization & administration , United Kingdom
6.
Arch Dis Child ; 96(2): 174-9, 2011 Feb.
Article in English | MEDLINE | ID: mdl-21030364

ABSTRACT

OBJECTIVE: To test the predictability of the Melbourne criteria for activation of the medical emergency team (MET) to identify children at risk of developing critical illness. DESIGN: Cohort study. SETTING: Admissions to all paediatric wards at the University Hospital of Wales. OUTCOME MEASURES: Paediatric high dependency unit admission, paediatric intensive care unit admission and death. RESULTS: Data were collected on 1000 patients. A single abnormal observation determined by the Melbourne Activation Criteria (MAC) had a sensitivity of 68.3% (95% CI 57.7 to 77.3), specificity 83.2% (95% CI 83.1 to 83.2), positive predictive value (PPV) 3.6% (95% CI 3.0 to 4.0) and negative predictive value 99.7% (95% CI 99.5 to 99.8) for an adverse outcome. Seven of the 16 children (43.8%) would not have transgressed the MAC prior to the adverse outcomes. Four hundred and sixty-nine of the 984 children (47.7%) who did not have an adverse outcome would have transgressed the MAC at least once during the admission. CONCLUSION: The MAC has a low PPV and its full implementation would result in a large number of false positive triggers. Further research is required to determine the relative contribution of the components of this complex intervention (Paediatric Early Warning System, education and MET) on patient outcome.


Subject(s)
Emergency Service, Hospital/organization & administration , Health Status Indicators , Hospital Rapid Response Team/organization & administration , Intensive Care Units, Pediatric/organization & administration , Adolescent , Child , Child, Preschool , Critical Illness/therapy , Epidemiologic Methods , False Positive Reactions , Humans , Infant , Infant, Newborn , Patient Admission/statistics & numerical data , Wales
7.
Epidemiol Infect ; 138(12): 1704-11, 2010 Dec.
Article in English | MEDLINE | ID: mdl-20587125

ABSTRACT

In summer 2008, we investigated an outbreak of diarrhoeal illness in participants of a mountain-bike event in Wales (UK) which had been affected by heavy rain. We conducted a retrospective cohort study to investigate the cause using an internet-based questionnaire. Fifty-three percent of those contacted responded, and 161 (46·5%) of the 347 responders reported gastrointestinal symptoms. The median time to onset was 3 days after the event. Ten riders reported receiving a laboratory-confirmed diagnosis of Campylobacter. Multivariate logistic regression analysis identified the inadvertent ingestion of mud (OR 2·5, 95% CI 1·5-4·2, P<0·001) and eating 'other' food during the event (OR 2·1, 95% CI 1·2-3·6, P=0·01) as significant risk factors for illness. We concluded that the outbreak was caused by Campylobacter, spread to the riders by the inadvertent ingestion of mud which had been contaminated with sheep faeces from the rural course. Mountain-bike race organizers should consider microbiological hazards when risk-assessing potential race courses. The internet is an efficient tool for the investigation of outbreaks in computer-literate populations.


Subject(s)
Athletes , Campylobacter Infections/epidemiology , Diarrhea/epidemiology , Disease Outbreaks , Internet , Telemedicine/methods , Adolescent , Adult , Animals , Campylobacter/isolation & purification , Campylobacter Infections/microbiology , Cohort Studies , Diarrhea/microbiology , Female , Humans , Male , Middle Aged , Retrospective Studies , Sheep , Soil Microbiology , Surveys and Questionnaires , Wales/epidemiology , Young Adult , Zoonoses/epidemiology , Zoonoses/microbiology
8.
J Water Health ; 8(2): 299-310, 2010 Jun.
Article in English | MEDLINE | ID: mdl-20154393

ABSTRACT

An outbreak in the autumn of 2005 resulted in 218 confirmed cases of Cryptosporidium hominis. The attack rate (relative risk 4.1, 95% CI 2.8-9.1) was significantly higher in the population supplied by Cwellyn Water Treatment Works (WTW). A case-control study demonstrated a statistically significant association (odds ratio 6.1, 95% CI 1.8-23.8) between drinking unboiled tap water and C. hominis infection. The association remained significant in a logistic regression analysis, with an adjusted odds ratio of 1.30 (95% CI 1.05-1.61) per glass of unboiled tap water consumed per day. This evidence, together with environmental and associated microbiological investigations and the absence of effective treatment to remove Cryptosporidium oocysts at the WTW, led to the conclusion that the outbreak was waterborne. Oocyst counts in final treated water at the WTW and at different points in the distribution system were consistently very low, with a maximum count on continuous monitoring of 0.08 oocysts per 10 litres. Data from continuous monitoring and the epidemic curve are consistent with the hypothesis that low numbers of oocysts of C. hominis were present in treated water continuously during the outbreak and these were of sufficient infectivity to cause illness. All water supplies derived from surface water present a potential risk to human health, and appropriate control measures should be in place to minimise these risks.


Subject(s)
Cryptosporidiosis/epidemiology , Cryptosporidium/isolation & purification , Disease Outbreaks , Oocysts , Case-Control Studies , Environmental Monitoring , Epidemiological Monitoring , Humans , Wales/epidemiology , Water Microbiology , Water Purification/methods , Water Supply
9.
Arch Dis Child ; 94(8): 602-6, 2009 Aug.
Article in English | MEDLINE | ID: mdl-18812403

ABSTRACT

OBJECTIVE: To develop and test the predictability of a paediatric early warning score to identify children at risk of developing critical illness. DESIGN: Prospective cohort study. SETTING: Admissions to all paediatric wards at the University Hospital of Wales. OUTCOME MEASURES: Respiratory arrest, cardiac arrest, paediatric high-dependency unit admission, paediatric intensive care unit admission and death. RESULTS: Data were collected on 1000 patients. A single abnormal observation determined by the Cardiff and Vale paediatric early warning system (C&VPEWS) had an 89.0% sensitivity (95% CI 80.5 to 94.1), 63.9% specificity (95% CI 63.8 to 63.9), 2.2% positive predictive value (95% CI 2.0 to 2.3) and a 99.8% negative predictive value (95% CI 99.7 to 99.9) for identifying children who subsequently had an adverse outcome. The area under the receiver operating characteristic curve for the C&VPEWS score was 0.86 (95% CI 0.82 to 0.91). CONCLUSION: Identifying children likely to develop critical illness can be difficult. The assessment tool developed from the advanced paediatric life support guidelines on identifying sick children appears to be sensitive but not specific. If the C&VPEWS was used as a trigger to activate a rapid response team to assess the child, the majority of calls would be unnecessary.


Subject(s)
Critical Illness , Emergency Service, Hospital/standards , Intensive Care Units, Pediatric/statistics & numerical data , Patient Transfer/statistics & numerical data , Adolescent , Child , Child, Preschool , Epidemiologic Methods , Humans , Infant , Infant, Newborn , Medical Audit/statistics & numerical data
10.
Br Dent J ; 205(4): E8; discussion 194-5, 2008 Aug 23.
Article in English | MEDLINE | ID: mdl-18650798

ABSTRACT

OBJECTIVES: To investigate the association between treatment by a dental healthcare worker (HCW) and patient infection with a blood-borne virus (BBV). DESIGN: Nested case control study. SETTING: A patient notification exercise (PNE) arising from a hepatitis C virus-positive HCW, undertaken because of deficiencies in infection control practice. METHODS: Cases were individuals with a BBV infection identified as a result of the PNE. Controls were randomly selected individuals with negative tests for BBVs. Detailed information on dental treatment was obtained from patient notes. Information on risk factors for BBV infection was obtained using a structured questionnaire administered by telephone interview. RESULTS: Thirty patients had evidence of infection with a BBV. The mean number of visits for treatment was 20.5 in cases and 18.6 in controls; the difference of 1.8 (95% CI -5.4 to 9.1) was not statistically significant (p = 0.62). Transmission of hepatitis C in the dental setting was excluded by sequencing of the viral genome or establishing alternative risk factors. CONCLUSION: There was no evidence of transmission of hepatitis C virus from the HCW to patients, or transmission of a BBV from patient to patient. To ensure consistent practice within the UK, the National Institute for Health and Clinical Excellence should produce guidance on PNEs for the NHS.


Subject(s)
Blood-Borne Pathogens , Dentists , Hepatitis C/transmission , Infection Control, Dental , Infectious Disease Transmission, Professional-to-Patient/statistics & numerical data , Case-Control Studies , Cross Infection/transmission , Dental Care/classification , Dental Care/statistics & numerical data , Disease Notification , Female , Genome, Viral/genetics , HIV/classification , Hepacivirus/classification , Hepacivirus/genetics , Hepatitis B virus/classification , Humans , Male , Mass Screening , Medical History Taking , Middle Aged , Risk Assessment , Time Factors , United Kingdom , Viremia/virology
11.
Epidemiol Infect ; 135(5): 798-801, 2007 Jul.
Article in English | MEDLINE | ID: mdl-17064456

ABSTRACT

In September 2002, facsimiles were sent to 360 primary-care physicians alerting them to a local outbreak of Q fever. The physicians subsequently submitted serology samples on significantly more patients than in a comparable period in 2001. A facsimile cascade assists effective communication with primary-care physicians in an outbreak investigation.


Subject(s)
Disease Outbreaks , Q Fever/epidemiology , Telefacsimile , Complement Fixation Tests , Humans , Physicians, Family
12.
Epidemiol Infect ; 134(6): 1167-73, 2006 Dec.
Article in English | MEDLINE | ID: mdl-16623990

ABSTRACT

A case-control study was undertaken in an acute district general hospital to identify risk factors for hospital-acquired bacteraemia caused by methicillin-resistant Staphylococcus aureus (MRSA). Cases of hospital-acquired MRSA bacteraemia were defined as consecutive patients from whom MRSA was isolated from a blood sample taken on the third or subsequent day after admission. Controls were randomly selected from patients admitted to the hospital over the same time period with a length of stay of more than 2 days who did not have bacteraemia. Data on 42 of the 46 cases of hospital-acquired bacteraemia and 90 of the 92 controls were available for analysis. There were no significant differences in the age or sex of cases and controls. After adjusting for confounding factors, insertion of a central line [adjusted odds ratio (aOR) 35.3, 95% confidence interval (CI) 3.8-325.5] or urinary catheter (aOR 37.1, 95% CI 7.1-193.2) during the admission, and surgical site infection (aOR 4.3, 95% CI 1.2-14.6) all remained independent risk factors for MRSA bacteraemia. The adjusted population attributable fraction showed that 51% of hospital-acquired MRSA bacteraemia cases were attributable to a urinary catheter, 39% to a central line, and 16% to a surgical site infection. In the United Kingdom, measures to reduce the incidence of hospital-acquired MRSA bacteraemia in acute general hospitals should focus on improving infection control procedures for the insertion and, most importantly, care of central lines and urinary catheters.
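Attributable fractions of the kind quoted above are commonly derived from Miettinen's case-based formula, PAF = p_c × (OR − 1) / OR, where p_c is the proportion of cases exposed. A minimal sketch follows; the exposure proportion is a hypothetical input, not a figure from the study:

```python
def attributable_fraction(p_cases_exposed, odds_ratio):
    """Miettinen's case-based formula: the fraction of cases attributable to
    an exposure, given the proportion of cases exposed and the (adjusted)
    odds ratio for that exposure."""
    return p_cases_exposed * (odds_ratio - 1) / odds_ratio

# Hypothetical: if just over half of bacteraemia cases had a urinary
# catheter inserted, an OR as large as 37.1 implies nearly all of that
# exposed fraction is attributable to the catheter.
print(f"{attributable_fraction(0.52, 37.1):.0%}")
```

Because (OR − 1)/OR approaches 1 for large odds ratios, the attributable fraction is driven almost entirely by how common the exposure is among cases, which is why ubiquitous devices such as urinary catheters can dominate the attributable burden.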


Subject(s)
Bacteremia/epidemiology , Community-Acquired Infections/epidemiology , Cross Infection/epidemiology , Methicillin Resistance , Methicillin/pharmacology , Staphylococcus aureus/drug effects , Bacteremia/etiology , Case-Control Studies , Community-Acquired Infections/complications , Community-Acquired Infections/etiology , Community-Acquired Infections/prevention & control , Cross Infection/complications , Cross Infection/prevention & control , Humans , Risk Factors , Staphylococcal Infections/epidemiology , Staphylococcal Infections/microbiology
13.
J Antimicrob Chemother ; 55(5): 628-33, 2005 May.
Article in English | MEDLINE | ID: mdl-15772143

ABSTRACT

Progress on rational intervention to prevent increasing antibiotic resistance has been slow. We suggest that this is because the science of resistance epidemiology has received little attention, and that a systematic, co-operative investigation of this area might yield a relevant knowledge base, analogous to the basis for effective public health intervention in infectious disease given by infection epidemiology. The steps required to progress this approach in the UK are discussed, along with a summary of what is known and speculation on what might emerge.


Subject(s)
Anti-Bacterial Agents/pharmacology , Bacterial Infections/epidemiology , Drug Resistance, Bacterial , Population Surveillance/methods , Adolescent , Aged , Bacteria/drug effects , Bacterial Infections/drug therapy , Bacterial Infections/microbiology , Child , Child, Preschool , Humans , Infant , Prevalence , Research Design , United Kingdom
14.
J Antimicrob Chemother ; 53(6): 1010-7, 2004 Jun.
Article in English | MEDLINE | ID: mdl-15102750

ABSTRACT

OBJECTIVE: To investigate the effects of laboratory testing policies, particularly selective testing, rule-based reporting and isolate identification, on estimates of community antimicrobial resistance. MATERIALS AND METHODS: Antibiotic resistance estimates were analysed from an all-Wales dataset for approximately 300 000 community isolates of common pathogens. RESULTS: Selective testing policies were often associated with markedly increased resistance, particularly for second-line testing. Site-specific testing tended to yield variant resistance estimates for eye and ear isolates. Estimates from rule-based reporting deviated markedly from test-result-based reporting. Urinary isolates reported as Escherichia coli showed greater susceptibility than those reported as undifferentiated urinary 'coliforms'. The proportion of isolates tested for an antibiotic by a laboratory was a useful indicator of selective testing in this dataset. Selective testing policies had invariably been applied where the proportion of isolates of a species tested against an antibiotic was <90%. As this proportion fell with increasingly selective policies, divergence from pooled-all-Wales non-selective estimates tended to increase, with a bias to increased resistance. CONCLUSIONS: Selective testing, rule-based reporting and urinary coliform identification policies all had significant effects upon resistance estimates. Triage based upon the proportion of isolates tested seemed a useful tool in assigning analysis resources. Where <20% of isolates were tested, selective policies with inherent bias to increased resistance were common, the low number of isolates gave high potential sampling errors, and little confidence could be placed in the resistance estimate. Where 20-90% of isolates were tested, detailed analysis sometimes revealed resistance estimates that might be usefully retrieved. Where ≥90% of isolates were tested, there was no evidence of selective testing, and inter-laboratory variation in estimates appeared to be safely ascribable to other effects, e.g. methodology or real variation in resistance levels.


Subject(s)
Drug Resistance, Bacterial , Laboratories/standards , Microbial Sensitivity Tests/standards , Population Surveillance/methods , Anti-Bacterial Agents/pharmacology , Bacterial Infections/drug therapy , Bacterial Infections/epidemiology , Bacterial Infections/microbiology , Drug Prescriptions , Enterobacteriaceae , Humans , Public Policy , Urinary Tract Infections/microbiology , Wales/epidemiology
16.
Commun Dis Public Health ; 4(4): 300-4, 2001 Dec.
Article in English | MEDLINE | ID: mdl-12109399

ABSTRACT

An outbreak of Salmonella indiana infection in December 2000 affected 17 staff, relatives and patients at an acute NHS Hospital in Swansea. Epidemiological investigation identified egg mayonnaise sandwiches as the vehicle of infection. It was not possible to definitively determine the source of the infection or how the prepared sandwiches became contaminated. The most likely explanation was a pasteurisation failure of a batch of the egg roll used to make these sandwiches. Sandwiches are the most frequently identified vehicle of infection in foodborne outbreaks of salmonella infection in hospitals in England and Wales. The process of sandwich preparation has inherent risks because it involves considerable handling of food, which is consumed without further cooking. Care is required in all stages of preparation including the sourcing of materials used to produce the sandwiches. NHS Trusts should review their Hazard Analysis Critical Control Point plans for sandwich production.


Subject(s)
Disease Outbreaks , Eggs/microbiology , Food Service, Hospital/standards , Salmonella Infections/epidemiology , Case-Control Studies , Eggs/adverse effects , Hospitals, Public , Humans , Salmonella Infections/etiology , State Medicine , United Kingdom/epidemiology
18.
Commun Dis Public Health ; 3(1): 67-8, 2000 Mar.
Article in English | MEDLINE | ID: mdl-10743326

ABSTRACT

Uptake of the measles, mumps, and rubella vaccine in the United Kingdom has declined to levels that will allow outbreaks of these preventable diseases to occur. A leaflet sent with a personalized reminder did not increase vaccine uptake in children who had not been immunised at 21 months of age.


Subject(s)
Health Education , Measles Vaccine , Measles/prevention & control , Mumps Vaccine , Mumps/prevention & control , Rubella Vaccine , Rubella/prevention & control , Child, Preschool , Health Behavior , Humans , Infant , Measles-Mumps-Rubella Vaccine , Postal Service , Reminder Systems , Vaccines, Combined