Results 1 - 4 of 4
1.
J Allergy Clin Immunol Pract; 9(11): 4075-4086.e5, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34293502

ABSTRACT

BACKGROUND: There is no current consensus on assigning severity to food-induced allergic reactions, for example, to assess the efficacy of allergen immunotherapy. Existing severity scores lack the capability to discriminate between non-anaphylaxis reactions of different severities. Attempts are ongoing to develop a more discriminatory score, which should ideally be data-driven and validated in multiple cohorts.

OBJECTIVE: To undertake an exercise using best-worst scaling (BWS) to define a potential gold standard against which severity scoring of food-induced allergic reactions can be refined.

METHODS: We undertook a global survey, using BWS methodology, to better understand how health care professionals rate the severity of food-induced allergic reactions. Respondents were given a number of patient case vignettes describing real-world allergic reactions and asked to select the pair that, in their opinion, reflected the maximum difference in severity. Responses were then modeled and a preference score (representing severity) determined for each scenario. Scenarios were also scored using existing published scoring systems, and these scores were compared with the BWS score using Spearman r correlation and Cohen kappa. Given the global differences in definitions of anaphylaxis, we also evaluated differences in BWS ranking depending on the geographical location of respondents.

RESULTS: We received 334 complete responses, 183 (55%) from Europe and 65 (20%) from North America. Perception of the severity of some reactions appeared to be affected by geographical location. Comparison of the BWS ranking with current grading systems identified significant issues that varied from one grading system to another, such as undue prominence given to some symptoms (eg, vomiting), which skews grading when scoring systems not designed for food allergy are used. In general, current scoring systems discriminate poorly between milder symptoms and often overestimate their severity.

CONCLUSIONS: These data provide a methodology free of user scale bias to help define a potential, consensus-driven gold standard against which improved grading systems for scoring food-induced allergic symptoms can be developed and validated. They also highlight areas for education where there is the potential to miscategorize severity.


Subject(s)
Anaphylaxis; Food Hypersensitivity; Allergens; Anaphylaxis/diagnosis; Desensitization, Immunologic; Food; Food Hypersensitivity/diagnosis; Humans
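As a rough illustration of the score-comparison step described in the abstract above, the sketch below correlates a set of BWS-derived preference scores with grades from an existing system using Spearman r and Cohen kappa. All vignette values are invented for illustration; only the two statistics are taken from the abstract.

```python
# Illustrative sketch: comparing BWS-derived severity scores against an
# existing grading system, per the methods described in the abstract.
# All numbers below are hypothetical, not study data.
from scipy.stats import spearmanr
from sklearn.metrics import cohen_kappa_score

# Hypothetical BWS preference scores for 8 case vignettes (higher = more severe).
bws_scores = [0.05, 0.12, 0.20, 0.35, 0.48, 0.61, 0.79, 0.92]

# Hypothetical grades (1-4) assigned to the same vignettes by an existing system.
existing_grades = [1, 1, 2, 2, 2, 3, 3, 4]

# Rank correlation between the continuous BWS score and the ordinal grade.
rho, p_value = spearmanr(bws_scores, existing_grades)
print(f"Spearman r = {rho:.2f} (p = {p_value:.3f})")

# Kappa requires both ratings on the same categorical scale, so bin the
# BWS scores into 4 grades before comparing agreement.
bws_grades = [min(4, int(score * 4) + 1) for score in bws_scores]
kappa = cohen_kappa_score(bws_grades, existing_grades)
print(f"Cohen kappa = {kappa:.2f}")
```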
2.
J Allergy Clin Immunol Pract; 7(8): 2770-2774.e3, 2019.
Article in English | MEDLINE | ID: mdl-31078761

ABSTRACT

BACKGROUND: Lip dose challenges (LDCs) are often performed as an initial step before oral food challenges (OFCs). However, guidance on how to perform and interpret LDCs is unclear, and data are lacking on their diagnostic accuracy.

OBJECTIVE: To investigate current practice with respect to LDCs among UK allergy health care professionals, and to evaluate the diagnostic utility of LDCs in children undergoing OFCs for IgE-mediated food allergy.

METHODS: We used an electronic survey to assess the use of LDCs by UK allergy clinics. Separately, we prospectively recruited children undergoing "low-risk" OFCs for suspected IgE-mediated food allergy at 2 large specialist allergy units in London. The LDC was performed 30 minutes before the OFC by applying the food to the inner lip for 30 seconds; objective symptoms were considered a positive outcome. All patients subsequently proceeded to OFC regardless of LDC outcome, and OFC outcomes were assessed according to the PRACTALL consensus.

RESULTS: We received 147 responses to the online survey, representing 67% of registered pediatric allergy clinics in the United Kingdom. Eighty percent of respondents (representing 81% of responding centers) included LDC as the first step of OFC in routine clinical practice. There was wide variation in both how LDCs were performed and how they were interpreted, with one-third of respondents not proceeding to OFC if the LDC resulted in subjective symptoms. In the prospective study, 198 children (mean age, 7 years) with conclusive OFCs were included. Foods tested were tree nuts (30%), peanut (16.6%), egg (16%), fish (10.5%), milk (6%), shrimp (4%), and other (16.9%). There were 12 positive LDCs (1 of which triggered systemic symptoms: generalized urticaria) and 31 positive OFCs. Two children with positive LDCs went on to have a negative diagnostic OFC. Sensitivity of the LDC was 32% and specificity 98%, with a false-negative rate of 68%.

CONCLUSIONS: Most UK allergy clinics included LDC as an initial step during OFC, despite wide variation in how LDCs are performed and interpreted, raising major concerns about the reproducibility and validity of the test. We found that the LDC had poor sensitivity as an alternative or initial step to formal OFC.


Subject(s)
Food Hypersensitivity/diagnosis; Immunologic Tests/methods; Lip; Adolescent; Attitude of Health Personnel; Child; Child, Preschool; Health Personnel; Humans; Infant; Surveys and Questionnaires; United Kingdom
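The accuracy figures in the abstract follow directly from the reported counts (198 conclusive OFCs, 31 positive OFCs, 12 positive LDCs, 2 of which had a negative OFC), treating the OFC as the reference standard. A minimal sketch reproducing the arithmetic; the variable names are ours:

```python
# Reconstructing the LDC accuracy metrics from the counts reported above,
# with the OFC as the reference standard.
n_total = 198        # children with conclusive OFCs
ofc_positive = 31    # reference-positive (OFC confirmed food allergy)
ldc_positive = 12    # positive lip dose challenges
false_positives = 2  # positive LDC but negative OFC

true_positives = ldc_positive - false_positives               # 10
false_negatives = ofc_positive - true_positives               # 21
true_negatives = (n_total - ofc_positive) - false_positives   # 165

sensitivity = true_positives / ofc_positive                # 10/31
specificity = true_negatives / (n_total - ofc_positive)    # 165/167
fn_rate = false_negatives / ofc_positive                   # 21/31

print(f"Sensitivity: {sensitivity:.0%}")          # 32%
print(f"Specificity: {specificity:.1%}")          # 98.8% (reported as 98%)
print(f"False-negative rate: {fn_rate:.0%}")      # 68%
```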
3.
Pediatr Nephrol; 29(10): 2005-11, 2014 Oct.
Article in English | MEDLINE | ID: mdl-24777534

ABSTRACT

BACKGROUND: Standard leucodepleted blood transfusions can induce the production of human leukocyte antigen (HLA)-specific antibodies, which are associated with longer transplant waiting times and poorer graft outcomes. We hypothesized that additional washing of leucodepleted red cells might reduce the antigenic stimulus by removing residual leukocytes and soluble HLA.

METHODS: We retrospectively reviewed HLA antibodies in children with chronic kidney disease stages 4-5 who had ≥2 HLA antibody screens between 2000 and 2009, pre- and post-transfusion, and who were HLA antibody-negative at first testing. Patients were divided according to whether they received standard leucodepleted blood or "washed cells". To assess the efficacy of the washing methods, total leukocytes were enumerated before and after manual and automated washing of standard leucodepleted red cells that had been supplemented with whole blood to achieve measurable leukocyte levels pre-washing.

RESULTS: A total of 106 children were included: 23 received no blood transfusions (group 1), 6 had washed cells only (group 2), 59 had standard transfusions only (group 3), and 18 had both standard and washed cells (group 4). Sensitization rates were 26%, 17%, 44%, and 44% in groups 1-4, respectively (p = 0.32). Patients in groups 3 and 4 had more transfusions with red cells, platelets, and plasma products. There was no difference in HLA sensitization risk between washed and standard red cells on analysis of covariance controlling for platelet and plasma transfusions. The red cell washing study showed no significant reduction in leukocytes using manual methods. Although the automated method produced a statistically significant reduction of 33% from the pre-washing baseline, from 6.54 ± 0.84 × 10⁶ to 4.36 ± 0.67 × 10⁶ leukocytes per unit, the majority of leukocytes still remained.

CONCLUSIONS: There was no evidence that using washed leucodepleted red cells reduced patient HLA sensitization rates. Washing leucodepleted red cells is unlikely to reduce the risk of HLA sensitization, given its limited effect on residual leukocytes.


Subject(s)
Erythrocyte Transfusion/adverse effects; Erythrocyte Transfusion/methods; HLA Antigens/immunology; Isoantibodies/blood; Leukapheresis/methods; Renal Insufficiency, Chronic/therapy; Child; Female; Humans; Male
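The 33% reduction quoted for the automated method follows from the two mean leukocyte counts reported above; a short sketch confirms the arithmetic and the point that most leukocytes remain:

```python
# Percent reduction in leukocytes per unit with automated washing,
# from the mean counts reported in the abstract.
pre_wash, post_wash = 6.54e6, 4.36e6  # mean leukocytes per unit

reduction = (pre_wash - post_wash) / pre_wash
print(f"Reduction: {reduction:.0%}")      # 33%
print(f"Remaining: {1 - reduction:.0%}")  # 67% -- most leukocytes remain
```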
4.
Clin Infect Dis; 57(3): 407-14, 2013 Aug.
Article in English | MEDLINE | ID: mdl-23645848

ABSTRACT

BACKGROUND: Noroviruses are highly transmissible and a major cause of nosocomial gastroenteritis, resulting in bed and hospital-ward closures. Where hospital outbreaks are suspected, it is important to determine the routes of spread so that appropriate infection-control procedures can be implemented. To investigate a cluster of norovirus cases occurring in children undergoing bone marrow transplant, we undertook norovirus genome sequencing by next-generation methods. Detailed comparison of sequence data from 2 linked cases enabled us to identify the likely direction of spread.

METHODS: Norovirus complementary DNA was amplified by overlapping polymerase chain reaction (PCR) from 13 stool samples from 5 diagnostic real-time PCR-positive patients. The amplicons were sequenced on the Roche 454 platform, the genomes were assembled de novo, and the data were analyzed phylogenetically.

RESULTS: Phylogenetic analysis indicated that the patients were infected by viruses similar to 4 distinct GII.4 subtypes, and that 2 patients were linked by the same virus. Of the 14 sites at which the consensus sequences of the 2 linked viral genomes differed, 9 had minor variants present within one or the other patient. Further analysis confirmed that the minor variants at all 9 sites in patient B were present as the consensus sequence in patient A.

CONCLUSIONS: Phylogenetic analysis excluded a common source of infection in this apparent outbreak. Two of 3 patients on the same ward had closely related viruses, raising the possibility of cross-infection despite protective isolation. Analysis of deep sequencing data enabled us to establish the likely direction of nosocomial transmission.


Subject(s)
Caliciviridae Infections/transmission; Caliciviridae Infections/virology; Gastroenteritis/virology; Norovirus/classification; Norovirus/isolation & purification; RNA, Viral/genetics; Caliciviridae Infections/epidemiology; Child; Child, Preschool; Cluster Analysis; Cross Infection/epidemiology; Cross Infection/transmission; Cross Infection/virology; Female; Gastroenteritis/epidemiology; Genome, Viral; Genotype; High-Throughput Nucleotide Sequencing; Humans; Infant; Male; Molecular Epidemiology; Molecular Sequence Data; Norovirus/genetics; Phylogeny
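A schematic sketch of the direction-of-transmission logic in the abstract above: at each site where the two consensus genomes differ, check whether one patient's minor variants match the other patient's consensus base. The data structure, positions, and bases below are illustrative placeholders, not the study's sequences.

```python
# Illustrative sketch of the minor-variant comparison described above.
# For each site where the two consensus genomes differ, record:
#   (patient A consensus base, patient B consensus base,
#    minor variant bases observed within patient B's reads)
divergent_sites = {
    1043: ("A", "G", {"A"}),
    2871: ("C", "T", {"C"}),
    5310: ("G", "A", {"G"}),
    # ... one entry per divergent site (14 in the study)
}

# Sites where patient B carries patient A's consensus base as a minor variant.
supporting = [pos for pos, (cons_a, cons_b, minors_b) in divergent_sites.items()
              if cons_a in minors_b]

print(f"{len(supporting)}/{len(divergent_sites)} divergent sites where "
      f"A's consensus appears as a minor variant in B")
# In the study, this held at all 9 sites with detectable minor variants,
# which the authors used to infer the likely direction of transmission.
```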