1.
PLoS One ; 19(5): e0303132, 2024.
Article in English | MEDLINE | ID: mdl-38768224

ABSTRACT

There are few studies comparing the proportion, frequency, mortality and mortality rate following antimicrobial-resistant (AMR) infections between tertiary-care hospitals (TCHs) and secondary-care hospitals (SCHs) in low- and middle-income countries (LMICs) to inform intervention strategies. The aim of this study was to demonstrate the utility of an offline tool to generate AMR reports and data for a secondary data analysis. We conducted a secondary data analysis of retrospective, multicentre data on hospitalised patients in Thailand. Routinely collected microbiology and hospital admission data from 2012 to 2015, from 15 TCHs and 34 SCHs, were analysed using the AMASS v2.0 (www.amass.website). We then compared the burden of AMR bloodstream infections (BSI) between those TCHs and SCHs. Of 19,665 patients with AMR BSI caused by pathogens under evaluation, 10,858 (55.2%) and 8,807 (44.8%) were classified as community-origin and hospital-origin BSI, respectively. The burden of AMR BSI differed considerably between TCHs and SCHs, particularly for hospital-origin AMR BSI. The frequencies of hospital-origin AMR BSI per 100,000 patient-days at risk in TCHs were about twice those in SCHs for most pathogens under evaluation (for carbapenem-resistant Acinetobacter baumannii [CRAB]: 18.6 vs. 7.0, incidence rate ratio 2.77; 95%CI 1.72-4.43, p<0.001; for carbapenem-resistant Pseudomonas aeruginosa [CRPA]: 3.8 vs. 2.0, p = 0.0073; third-generation cephalosporin-resistant Escherichia coli [3GCREC]: 12.1 vs. 7.0, p<0.001; third-generation cephalosporin-resistant Klebsiella pneumoniae [3GCRKP]: 12.2 vs. 5.4, p<0.001; carbapenem-resistant K. pneumoniae [CRKP]: 1.6 vs. 0.7, p = 0.045; and methicillin-resistant Staphylococcus aureus [MRSA]: 5.1 vs. 2.5, p = 0.0091). All-cause in-hospital mortality (%) following hospital-origin AMR BSI was not significantly different between TCHs and SCHs (all p>0.20). Because of the higher frequencies, all-cause in-hospital mortality rates following hospital-origin AMR BSI per 100,000 patient-days at risk were considerably higher in TCHs for most pathogens (for CRAB: 10.2 vs. 3.6, mortality rate ratio 2.77; 95%CI 1.71 to 4.48, p<0.001; CRPA: 1.6 vs. 0.8, p = 0.020; 3GCREC: 4.0 vs. 2.4, p = 0.009; 3GCRKP: 4.0 vs. 1.8, p<0.001; CRKP: 0.8 vs. 0.3, p = 0.042; and MRSA: 2.3 vs. 1.1, p = 0.023). In conclusion, the burden of AMR infections in some LMICs might differ by hospital type and size. In those countries, activities and resources for antimicrobial stewardship and infection control programs might need to be tailored to the hospital setting. The frequency and in-hospital mortality rate of hospital-origin AMR BSI are important indicators and should be routinely measured to monitor the burden of AMR in every hospital with a microbiology laboratory in LMICs.


Subject(s)
Bacteremia , Tertiary Care Centers , Humans , Tertiary Care Centers/statistics & numerical data , Retrospective Studies , Thailand/epidemiology , Bacteremia/mortality , Bacteremia/drug therapy , Bacteremia/microbiology , Female , Male , Cross Infection/mortality , Cross Infection/microbiology , Cross Infection/drug therapy , Cross Infection/epidemiology , Anti-Bacterial Agents/therapeutic use , Anti-Bacterial Agents/pharmacology , Drug Resistance, Bacterial , Middle Aged , Aged , Adult , Hospital Mortality
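
Purely as an illustration of the rate comparison reported in this abstract, here is a minimal Python sketch of an incidence rate ratio with a Wald-type confidence interval on the log scale. The event counts and patient-day denominators are hypothetical, chosen only to give rates close to the CRAB example (18.6 vs. 7.0 per 100,000 patient-days at risk); the published 2.77 (95%CI 1.72-4.43) came from the actual study data, so the numbers below will not reproduce it exactly.

```python
import math

# Hypothetical counts chosen only to give rates near the CRAB example above
# (hospital-origin AMR BSI per 100,000 patient-days at risk).
events_tch, patient_days_tch = 558, 3_000_000   # tertiary-care hospitals
events_sch, patient_days_sch = 140, 2_000_000   # secondary-care hospitals

rate_tch = events_tch / patient_days_tch * 100_000
rate_sch = events_sch / patient_days_sch * 100_000
irr = rate_tch / rate_sch

# Wald 95% CI for the log incidence rate ratio, assuming Poisson counts.
se_log_irr = math.sqrt(1 / events_tch + 1 / events_sch)
lo = irr * math.exp(-1.96 * se_log_irr)
hi = irr * math.exp(1.96 * se_log_irr)

print(f"rates: {rate_tch:.1f} vs {rate_sch:.1f} per 100,000 patient-days")
print(f"IRR {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```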
2.
Am J Trop Med Hyg ; 2024 May 28.
Article in English | MEDLINE | ID: mdl-38806021

ABSTRACT

Information on notifiable bacterial diseases (NBD) in low- and middle-income countries (LMICs) is frequently incomplete. We developed the AutoMated tool for the Antimicrobial resistance Surveillance System plus (AMASSplus), which can support hospitals in analyzing their microbiology and hospital data files (in CSV or Excel format) automatically and promptly generating antimicrobial resistance surveillance and NBD reports (in PDF and CSV formats). The NBD reports include the total number of cases and deaths following Brucella spp., Burkholderia pseudomallei, Corynebacterium diphtheriae, Neisseria gonorrhoeae, Neisseria meningitidis, nontyphoidal Salmonella spp., Salmonella enterica serovar Paratyphi, Salmonella enterica serovar Typhi, Shigella spp., Streptococcus suis, and Vibrio spp. infections. We tested the tool in six hospitals in Thailand in 2022. The total number of deaths identified by AMASSplus was higher than that reported to the national notifiable disease surveillance system (NNDSS), particularly for B. pseudomallei infection (134 versus 2 deaths). This tool could support the NNDSS in LMICs.

3.
Nat Commun ; 14(1): 6153, 2023 10 03.
Article in English | MEDLINE | ID: mdl-37788991

ABSTRACT

Approximately 10% of antimicrobials used by humans in low- and middle-income countries are estimated to be substandard or falsified. In addition to their negative impact on morbidity and mortality, they may also be important drivers of antimicrobial resistance. Despite such concerns, our understanding of this relationship remains rudimentary. Substandard and falsified medicines have the potential to either increase or decrease levels of resistance, and here we discuss a range of mechanisms that could drive these changes. Understanding these effects and their relative importance will require an improved understanding of how different drug exposures affect the emergence and spread of resistance and of how the percentage of active pharmaceutical ingredients in substandard and falsified medicines is temporally and spatially distributed.


Subject(s)
Counterfeit Drugs , Humans , Anti-Bacterial Agents/pharmacology , Drug Resistance, Bacterial
4.
Nature ; 623(7985): 132-138, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37853126

ABSTRACT

Hospital-based transmission had a dominant role in Middle East respiratory syndrome coronavirus (MERS-CoV) and severe acute respiratory syndrome coronavirus (SARS-CoV) epidemics [1,2], but large-scale studies of its role in the SARS-CoV-2 pandemic are lacking. Such transmission risks spreading the virus to the most vulnerable individuals and can have wider-scale impacts through hospital-community interactions. Using data from acute hospitals in England, we quantify within-hospital transmission, evaluate likely pathways of spread and factors associated with heightened transmission risk, and explore the wider dynamical consequences. We estimate that, between June 2020 and March 2021, between 95,000 and 167,000 inpatients acquired SARS-CoV-2 in hospitals (1% to 2% of all hospital admissions in this period). Analysis of time series data provided evidence that patients who themselves acquired SARS-CoV-2 infection in hospital were the main sources of transmission to other patients. Increased transmission to inpatients was associated with hospitals having fewer single rooms and lower heated volume per bed. Moreover, we show that reducing hospital transmission could substantially enhance the efficiency of punctuated lockdown measures in suppressing community transmission. These findings reveal the previously unrecognized scale of hospital transmission, have direct implications for the targeting of hospital control measures and highlight the need to design hospitals better equipped to limit the transmission of future high-consequence pathogens.


Subject(s)
COVID-19 , Cross Infection , Disease Transmission, Infectious , Inpatients , Pandemics , Humans , Communicable Disease Control , COVID-19/epidemiology , COVID-19/transmission , Cross Infection/epidemiology , Cross Infection/prevention & control , Cross Infection/transmission , Disease Transmission, Infectious/prevention & control , Disease Transmission, Infectious/statistics & numerical data , England/epidemiology , Hospitals , Pandemics/prevention & control , Pandemics/statistics & numerical data , Quarantine/statistics & numerical data , SARS-CoV-2
5.
PLoS Med ; 20(6): e1004013, 2023 Jun.
Article in English | MEDLINE | ID: mdl-37319169

ABSTRACT

BACKGROUND: Reducing antibiotic treatment duration is a key component of hospital antibiotic stewardship interventions. However, its effectiveness in reducing antimicrobial resistance is uncertain, and a clear theoretical rationale for the approach is lacking. In this study, we sought to gain a mechanistic understanding of the relationship between antibiotic treatment duration and the prevalence of colonisation with antibiotic-resistant bacteria in hospitalised patients. METHODS AND FINDINGS: We constructed 3 stochastic mechanistic models that considered both between- and within-host dynamics of susceptible and resistant gram-negative bacteria, to identify circumstances under which shortening antibiotic duration would lead to reduced resistance carriage. In addition, we performed a meta-analysis of antibiotic treatment duration trials that monitored carriage of resistant gram-negative bacteria as an outcome. We searched MEDLINE and EMBASE for randomised controlled trials published from 1 January 2000 to 4 October 2022 that allocated participants to varying durations of systemic antibiotic treatment. Quality assessment was performed using the Cochrane risk-of-bias tool for randomised trials. The meta-analysis was performed using logistic regression, with duration of antibiotic treatment and time from administration of antibiotics to surveillance culture included as independent variables. Both the mathematical modelling and the meta-analysis suggested that modest reductions in resistance carriage could be achieved by reducing antibiotic treatment duration. The models showed that shortening duration is most effective at reducing resistance carriage in high-transmission compared to low-transmission settings. For treated individuals, shortening duration is most effective when resistant bacteria grow rapidly under antibiotic selection pressure and decline rapidly when treatment stops. Importantly, under circumstances whereby administered antibiotics can suppress colonising bacteria, shortening antibiotic treatment may increase the carriage of a particular resistance phenotype. We identified 206 randomised trials that investigated antibiotic duration. Of these, 5 reported carriage of resistant gram-negative bacteria as an outcome and were included in the meta-analysis. The meta-analysis determined that a single additional antibiotic treatment day is associated with a 7% absolute increase in risk of resistance carriage (80% credible interval 3% to 11%). Interpretation of these estimates is limited by the low number of antibiotic duration trials that monitored carriage of resistant gram-negative bacteria as an outcome, contributing to a large credible interval. CONCLUSIONS: In this study, we found both theoretical and empirical evidence that reducing antibiotic treatment duration can reduce resistance carriage, though the mechanistic models also highlighted circumstances under which reducing treatment duration can, perversely, increase resistance. Future antibiotic duration trials should monitor colonisation with antibiotic-resistant bacteria as an outcome to better inform antibiotic stewardship policies.


Subject(s)
Anti-Bacterial Agents , Duration of Therapy , Humans , Anti-Bacterial Agents/adverse effects , Drug Resistance, Bacterial
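
The meta-analysis described in this abstract used logistic regression with treatment duration and time from antibiotic administration to surveillance culture as covariates. Below is a rough arm-level sketch of that kind of model using statsmodels; the trial arms, counts and covariate values are invented, and the published estimate (a ~7% absolute risk increase per extra treatment day) came from the authors' Bayesian analysis of real trial data, not from this code.

```python
import numpy as np
import statsmodels.api as sm

# Invented arm-level data: resistance carriers / participants per trial arm,
# with each arm's antibiotic duration (days) and time to surveillance culture (days).
carriers   = np.array([12, 18, 10, 15,  9, 14])
totals     = np.array([60, 62, 55, 58, 50, 52])
duration   = np.array([ 3,  7,  5, 10,  3,  8])
time_to_cx = np.array([14, 14, 28, 28,  7,  7])

X = sm.add_constant(np.column_stack([duration, time_to_cx]))
y = np.column_stack([carriers, totals - carriers])  # successes, failures

fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()
print(fit.summary())
# Odds ratio for carriage per additional treatment day (on the odds scale,
# whereas the abstract reports an absolute risk difference).
print("OR per extra day:", np.exp(fit.params[1]))
```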
6.
Open Forum Infect Dis ; 9(9): ofac305, 2022 Sep.
Article in English | MEDLINE | ID: mdl-36092827

ABSTRACT

Background: Quantifying the excess mortality attributable to antimicrobial-resistant (AMR) bacterial infections is important for assessing the potential benefit of preventive interventions and for prioritization of resources. However, there are few data from low- and middle-income countries. Methods: We conducted a 2-year prospective surveillance study to estimate the excess mortality attributable to AMR infections for all types of hospital-acquired infection (HAI), including bacterial species that were both locally relevant and listed in the World Health Organization priority list. Twenty-eight-day mortality was measured. Excess mortality and the population attributable fraction (PAF) of mortality caused by AMR infections compared with antimicrobial-susceptible (AMS) infections, adjusted for predefined confounders, were calculated. Results: We enrolled 2043 patients with HAIs. The crude 28-day mortality of patients with AMR and AMS infections was 35.5% (491/1385) and 23.1% (152/658), respectively. After adjusting for prespecified confounders, the estimated excess mortality attributable to AMR infections was 7.7 (95% confidence interval [CI], 2.2-13.2) percentage points. This suggests that 106 (95% CI, 30-182) deaths among the 1385 patients with AMR infections might have been prevented if all of the AMR infections in this study had instead been AMS infections. The overall PAF was 16.3% (95% CI, 1.2%-29.1%). Among the bacteria under evaluation, carbapenem-resistant Acinetobacter baumannii was responsible for the largest number of excess deaths. Among all types of infection, urinary tract infections were associated with the highest number of excess deaths, followed by lower respiratory tract infections and bloodstream infections. Conclusions: Estimating and monitoring excess mortality attributable to AMR infections should be included in national action plans to prioritize targets for preventive interventions. Clinical Trials Registration: NCT03411538.
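
A back-of-the-envelope sketch of how the excess-death and population attributable fraction (PAF) figures in this abstract fit together, using the crude counts reported above. The published estimates were adjusted for confounders, so this unadjusted arithmetic only approximates them.

```python
# Counts reported in the abstract.
amr_deaths, amr_n = 491, 1385   # deaths among patients with AMR HAIs
ams_deaths, ams_n = 152, 658    # deaths among patients with AMS HAIs

# Adjusted excess mortality reported in the paper (7.7 percentage points).
excess_risk = 0.077

# Expected excess deaths among AMR patients had their infections been AMS.
excess_deaths = excess_risk * amr_n
print(f"excess deaths ~ {excess_deaths:.0f}")       # ~107; the paper reports 106

# PAF: share of all observed deaths attributable to resistance.
all_deaths = amr_deaths + ams_deaths
print(f"PAF ~ {excess_deaths / all_deaths:.1%}")    # ~16.6%; the paper reports 16.3%
```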

7.
Epidemiol Infect ; 150: e79, 2022 03 21.
Article in English | MEDLINE | ID: mdl-35445655

ABSTRACT

Hand hygiene is a simple, low-cost intervention that may lead to substantial population-level effects in suppressing acute respiratory infection epidemics. However, quantification of the efficacy of hand hygiene against respiratory infection in the community is lacking. We searched PubMed for randomised controlled trials on the effect of hand hygiene in reducing acute respiratory infections in the community, published before 11 March 2021. We performed a meta-regression analysis using a Bayesian mixed-effects model. A total of 105 publications were identified, of which six studies reported hand hygiene frequencies. Four studies were performed in household settings and two in schools. The average number of handwashing events per day ranged from one to eight in the control arms and from four to 17 in the intervention arms. We estimated that a single hand hygiene event is associated with a 3% (80% credible interval, -1% to 7%) decrease in the daily probability of an acute respiratory infection. Three of these six studies were potentially at high risk of bias because the primary outcome depended on self-reporting of upper respiratory tract symptoms. Well-designed trials with an emphasis on monitoring hand hygiene adherence are needed to confirm these findings.


Subject(s)
Epidemics , Hand Hygiene , Respiratory Tract Infections , Bayes Theorem , Hand Disinfection , Humans , Respiratory Tract Infections/epidemiology , Respiratory Tract Infections/prevention & control
8.
Infect Prev Pract ; 4(1): 100192, 2022 Mar.
Article in English | MEDLINE | ID: mdl-34870142

ABSTRACT

Many infection prevention and control (IPC) interventions have been adopted by hospitals to limit nosocomial transmission of SARS-CoV-2. The aim of this systematic review is to identify evidence on the effectiveness of these interventions. We conducted a literature search of five databases (OVID MEDLINE, Embase, CENTRAL, COVID-19 Portfolio (pre-print), Web of Science). SWIFT ActiveScreener software was used to screen English-language titles and abstracts published between 1st January 2020 and 6th April 2021. Intervention studies, as defined by Cochrane Effective Practice and Organisation of Care, that evaluated IPC interventions with an outcome of SARS-CoV-2 infection in either patients or healthcare workers were included. Personal protective equipment (PPE) was excluded as this intervention had been reviewed previously. Risk of bias was assessed using the Cochrane tool for randomised trials (RoB2) and non-randomised studies of interventions (ROBINS-I). From 23,156 screened articles, we identified seven that met the inclusion criteria, all of which evaluated interventions to prevent infections in healthcare workers and most of which focused on the effectiveness of prophylactic agents. Due to heterogeneity in interventions, we did not conduct a meta-analysis. All agents used for prophylaxis have little to no evidence of effectiveness against SARS-CoV-2 infection. We did not find any studies evaluating the effectiveness of interventions including, but not limited to, screening, isolation and improved ventilation. There is limited evidence from interventional studies, excluding PPE, evaluating IPC measures for SARS-CoV-2. This review calls for urgent action to implement such studies to inform policies that protect our most vulnerable populations and healthcare workers.

9.
Clin Microbiol Infect ; 27(10): 1391-1399, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34111583

ABSTRACT

BACKGROUND: Routine microbiology results are a valuable source of antimicrobial resistance (AMR) surveillance data in low- and middle-income countries (LMICs), as well as in high-income countries. Different approaches and strategies are used to generate AMR surveillance data. OBJECTIVES: We aimed to review strategies for AMR surveillance using routine microbiology results in LMICs and to highlight areas that need support to generate high-quality AMR data. SOURCES: We searched PubMed for papers that used routine microbiology to describe the epidemiology of AMR and drug-resistant infections in LMICs. We also included papers that, from our perspective, were critical in highlighting the biases and challenges, or that employed specific strategies to overcome these, in reporting AMR surveillance in LMICs. CONTENT: Topics covered include strategies for identifying AMR cases (including case-finding based on isolates from routine diagnostic specimens and case-based surveillance of clinical syndromes), for collecting data (including cohort, point-prevalence survey, and case-control designs), for sampling AMR cases (including lot quality assurance surveys), and for processing and analysing data for AMR surveillance in LMICs. IMPLICATIONS: The various AMR surveillance strategies warrant a thorough understanding of their limitations and potential biases to ensure maximum utilization and interpretation of local routine microbiology data across time and space. For instance, surveillance using case-finding based on results from clinical diagnostic specimens is relatively easy to implement and sustain in LMIC settings, but the resulting estimates of the incidence and proportion of AMR are at risk of bias due to underuse of microbiology. Case-based surveillance of clinical syndromes generates informative statistics that can be translated into clinical practice, but needs financial and technical support, as well as locally tailored training, to be sustained. Innovative AMR surveillance strategies that can easily be implemented and sustained at minimal cost would be useful for improving AMR data availability and quality in LMICs.


Subject(s)
Anti-Bacterial Agents , Drug Resistance, Bacterial , Epidemiological Monitoring , Anti-Bacterial Agents/pharmacology , Developing Countries , Humans , Lot Quality Assurance Sampling
10.
Am J Epidemiol ; 190(11): 2395-2404, 2021 11 02.
Article in English | MEDLINE | ID: mdl-34048554

ABSTRACT

Delays in treating bacteremias with antibiotics to which the causative organism is susceptible are expected to adversely affect patient outcomes. Quantifying the impact of such delays to concordant treatment is important for decision-making about interventions to reduce the delays and for quantifying the burden of disease due to antimicrobial resistance. There are, however, potentially important biases to be addressed, including immortal time bias. We aimed to estimate the impact of delays in appropriate antibiotic treatment of patients with Acinetobacter species hospital-acquired bacteremia in Thailand on 30-day mortality by emulating a target trial using retrospective cohort data from Sunpasitthiprasong Hospital in 2003-2015. For each day, we defined treatment as concordant if the isolated organism was susceptible to at least 1 antibiotic given. Among 1,203 patients with Acinetobacter species hospital-acquired bacteremia, 682 had 1 or more days of delays to concordant treatment. Surprisingly, crude 30-day mortality was lower in patients with delays of ≥3 days compared with those who had 1-2 days of delays. Accounting for confounders and immortal time bias resolved this paradox. Emulating a target trial, we found that these delays were associated with an absolute increase in expected 30-day mortality of 6.6% (95% confidence interval: 0.2, 13.0), from 33.8% to 40.4%.


Subject(s)
Acinetobacter Infections/mortality , Anti-Bacterial Agents/therapeutic use , Bacteremia/mortality , Cross Infection/mortality , Time-to-Treatment/statistics & numerical data , Acinetobacter Infections/drug therapy , Adult , Aged , Bacteremia/drug therapy , Cross Infection/drug therapy , Female , Hospital Mortality , Humans , Male , Middle Aged , Randomized Controlled Trials as Topic , Retrospective Studies , Thailand/epidemiology
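
A toy simulation, not the authors' target-trial emulation, illustrating the immortal time bias discussed in this abstract: even when the delay to concordant treatment has no effect on the daily hazard of death, naively grouping patients by their observed delay makes the long-delay group look healthier, because starting treatment late requires surviving that long. Restricting the comparison to patients alive at a landmark day removes the artefact. All parameters below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n, daily_hazard = 50_000, 0.02

# Day of death under a constant daily hazard; the hazard does NOT depend on
# the intended delay, so the true effect of delay on mortality is null.
death_day = rng.geometric(daily_hazard, n)
intended_delay = rng.integers(1, 8, n)      # planned day of concordant treatment
died_30d = death_day <= 30

# Naive analysis: classify patients by observed delay, i.e. only those who
# survived long enough to actually start concordant treatment on that day.
started = death_day > intended_delay
long_delay = intended_delay >= 3
print("naive    1-2d: %.1f%%  >=3d: %.1f%%" % (
    100 * died_30d[started & ~long_delay].mean(),
    100 * died_30d[started & long_delay].mean()))

# Landmark analysis at day 3: compare groups only among patients still alive
# on day 3; the spurious survival advantage of the long-delay group vanishes.
alive_day3 = death_day > 3
print("landmark 1-2d: %.1f%%  >=3d: %.1f%%" % (
    100 * died_30d[alive_day3 & ~long_delay].mean(),
    100 * died_30d[alive_day3 & long_delay].mean()))
```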
11.
J Infect ; 82(3): 355-362, 2021 03.
Article in English | MEDLINE | ID: mdl-33278401

ABSTRACT

OBJECTIVES: The magnitude of the impact of low blood culture utilization on estimates of the proportions and incidence rates of antimicrobial-resistant (AMR) bacterial infections is largely unknown. METHODS: We used routine electronic databases of microbiology, hospital admission and drug prescription data at Sunpasitthiprasong Hospital, Ubon Ratchathani, Thailand, from 2011 to 2015, and bootstrap simulations. RESULTS: The proportions of Escherichia coli and Klebsiella pneumoniae bacteraemias caused by third-generation cephalosporin-resistant isolates (3GCREC and 3GCRKP) were estimated to increase by 13 and 24 percentage points (from 44% to 57% and from 51% to 75%), respectively, if the blood culture utilization rate were reduced from 82 to 26 blood culture specimens per 1,000 patient-days. Among patients with hospital-origin bloodstream infections, the proportion of 3GCREC and 3GCRKP whose first positive blood culture was taken within ±1 calendar day of the start of a parenteral antibiotic at the study hospital was substantially lower than in those whose first positive blood culture was taken later into parenteral antibiotic treatment (30% versus 79%, p<0.001; and 37% versus 86%, p<0.001). Similar effects were observed for methicillin-resistant Staphylococcus aureus, carbapenem-resistant Acinetobacter spp. and carbapenem-resistant Pseudomonas aeruginosa. CONCLUSION: The impact of a low blood culture utilization rate on the estimated proportions and incidence rates of AMR infections could be substantial. We recommend that AMR surveillance reports additionally include the blood culture utilization rate and stratification by exposure to a parenteral antibiotic at the hospital.


Subject(s)
Anti-Bacterial Agents , Methicillin-Resistant Staphylococcus aureus , Anti-Bacterial Agents/pharmacology , Anti-Bacterial Agents/therapeutic use , Blood Culture , Drug Resistance, Bacterial , Humans , Microbial Sensitivity Tests , Thailand
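
The stratification recommended in this abstract, splitting AMR proportions by whether the first positive blood culture was taken within ±1 day of starting a parenteral antibiotic or later into treatment, could look roughly like the sketch below. The file and column names (patient_id, organism, resistant_3gc, culture_date, first_abx_date) are hypothetical, not AMASS or WHONET fields.

```python
import pandas as pd

# Hypothetical line list of first positive blood cultures, one row per
# patient-organism, with invented column names for this sketch.
df = pd.read_csv("blood_cultures.csv", parse_dates=["culture_date", "first_abx_date"])

days_from_abx = (df["culture_date"] - df["first_abx_date"]).dt.days
# "late" = first positive culture taken more than 1 day after the first
# parenteral antibiotic, mirroring the stratification suggested above.
df["late_in_treatment"] = days_from_abx > 1

summary = (df.groupby(["organism", "late_in_treatment"])["resistant_3gc"]
             .agg(n="size", resistant="sum"))
summary["pct_resistant"] = (100 * summary["resistant"] / summary["n"]).round(1)
print(summary)
```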
12.
J Med Internet Res ; 22(10): e19762, 2020 10 02.
Article in English | MEDLINE | ID: mdl-33006570

ABSTRACT

BACKGROUND: Reporting cumulative antimicrobial susceptibility testing data on a regular basis is crucial to inform antimicrobial resistance (AMR) action plans at local, national, and global levels. However, analyzing data and generating a report are time consuming and often require trained personnel. OBJECTIVE: This study aimed to develop and test an application that can support a local hospital in analyzing routinely collected electronic data independently and generating AMR surveillance reports rapidly. METHODS: An offline application to generate standardized AMR surveillance reports from routinely available microbiology and hospital data files was written in the R programming language (R Project for Statistical Computing). The application can be run by double-clicking on the application file, without any further user input. The data analysis procedure and report content were developed based on the recommendations of the World Health Organization Global Antimicrobial Resistance Surveillance System (WHO GLASS). The application was tested on Microsoft Windows 10 and 7 using open-access example data sets. We then independently tested the application in seven hospitals in Cambodia, the Lao People's Democratic Republic, Myanmar, Nepal, Thailand, the United Kingdom, and Vietnam. RESULTS: We developed the AutoMated tool for Antimicrobial resistance Surveillance System (AMASS), which can support clinical microbiology laboratories to analyze their microbiology and hospital data files (in CSV or Excel format) onsite and promptly generate AMR surveillance reports (in PDF and CSV formats). The data files can be those exported from WHONET or other laboratory information systems. The automatically generated reports contain only summary data, without patient identifiers. The AMASS application is downloadable from https://www.amass.website/. The participating hospitals tested the application and deposited their AMR surveillance reports in an open-access data repository. CONCLUSIONS: AMASS is a useful tool to support the generation and sharing of AMR surveillance reports.


Subject(s)
Drug Resistance, Bacterial/drug effects , Hospitals/statistics & numerical data , Epidemiological Monitoring , Humans , Proof of Concept Study
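
AMASS itself is an R application; purely to illustrate the kind of GLASS-style summary it produces, here is a Python sketch that deduplicates to the first blood isolate per patient and tabulates the proportion of non-susceptible isolates by organism and antibiotic. The file layout and column names are hypothetical and are not the AMASS input specification.

```python
import pandas as pd

# Hypothetical isolate-level export (one row per isolate and antibiotic);
# the column names are invented for this sketch.
micro = pd.read_csv("microbiology_data.csv", parse_dates=["specimen_date"])

# Keep the first blood isolate per patient, organism and antibiotic -- a
# common deduplication rule in GLASS-style surveillance summaries.
blood = micro[micro["specimen_type"] == "blood"]
first = (blood.sort_values("specimen_date")
              .drop_duplicates(["patient_id", "organism", "antibiotic"]))

report = (first.assign(non_susceptible=first["susceptibility"].ne("S"))
               .groupby(["organism", "antibiotic"])["non_susceptible"]
               .agg(isolates="size", non_susceptible="sum"))
report["pct_non_susceptible"] = (100 * report["non_susceptible"] / report["isolates"]).round(1)

# Like AMASS, write only aggregate counts (no patient identifiers) to file.
report.to_csv("amr_summary.csv")
```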
14.
Wellcome Open Res ; 4: 207, 2019.
Article in English | MEDLINE | ID: mdl-32420455

ABSTRACT

Protocol non-adherence is common and poses unique challenges for the interpretation of trial outcomes, especially in non-inferiority trials. We performed simulations of a non-inferiority trial with a time-fixed treatment and a binary endpoint in order to: i) explore the impact of various patterns of non-adherence and analysis methods on treatment effect estimates; ii) quantify the probability of claiming non-inferiority when the experimental treatment is actually inferior; and iii) evaluate alternative methods such as inverse probability weighting and instrumental variable estimation. We found that the probability of concluding non-inferiority when the experimental treatment is actually inferior depends on whether non-adherence is due to confounding or non-confounding factors, and on the actual treatments received by the non-adherent participants. Under most patterns of non-adherence, intention-to-treat analysis has a higher tendency to conclude non-inferiority when the experimental treatment is actually inferior. This probability of wrongly concluding non-inferiority can increase from 0.025 to as high as 0.1 even when adherence is relatively high, at 90%. The direction of bias for the per-protocol analysis depends on the directions of influence the confounders have on adherence and on the probability of the outcome. The inverse probability weighting approach can reduce bias but will only eliminate it if all confounders can be measured without error and are appropriately adjusted for. Instrumental variable estimation overcomes this limitation and gives unbiased estimates even when confounders are unknown, but typically requires large sample sizes to achieve acceptable power. Investigators need to consider patterns of non-adherence and potential confounders in trial design. Adjusted analysis of the per-protocol population, with sensitivity analyses on confounders, and other approaches such as instrumental variable estimation should be considered when non-adherence is anticipated. We provide an online power calculator allowing for various patterns of non-adherence using the above methods.
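
A small simulation in the spirit of the study above (the authors' own implementation sits behind their online power calculator): an intention-to-treat analysis of a non-inferiority trial with a binary endpoint, where a fraction of the experimental arm does not adhere and effectively receives the control treatment. The failure rates, margin, sample size and adherence level below are arbitrary choices for illustration, not the scenarios simulated in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def itt_claims_noninferiority(n_per_arm=1000, p_control=0.10, p_experimental=0.15,
                              adherence=0.90, margin=0.05):
    """One simulated trial; True if an intention-to-treat analysis claims
    non-inferiority (upper limit of the two-sided 95% CI for the risk
    difference below the margin, i.e. one-sided alpha of 0.025)."""
    control = rng.random(n_per_arm) < p_control
    # Non-adherent participants randomised to the experimental arm effectively
    # receive the control treatment (one of the non-adherence patterns above).
    adheres = rng.random(n_per_arm) < adherence
    event_prob = np.where(adheres, p_experimental, p_control)
    experimental = rng.random(n_per_arm) < event_prob

    rd = experimental.mean() - control.mean()
    se = np.sqrt(experimental.mean() * (1 - experimental.mean()) / n_per_arm
                 + control.mean() * (1 - control.mean()) / n_per_arm)
    return rd + 1.96 * se < margin

# The experimental failure rate sits exactly one margin above control, so the
# experimental treatment is truly (unacceptably) inferior; with full adherence
# the chance of wrongly claiming non-inferiority would be ~0.025.
claims = np.mean([itt_claims_noninferiority() for _ in range(5000)])
print(f"P(claim non-inferiority | truly inferior, 90% adherence) = {claims:.3f}")
```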

15.
PLoS Negl Trop Dis ; 10(12): e0005204, 2016 12.
Article in English | MEDLINE | ID: mdl-27973567

ABSTRACT

BACKGROUND: Culture is the gold standard for the detection of environmental B. pseudomallei. In general, soil specimens are cultured in enrichment broth for 2 days, and the culture broth is then streaked on an agar plate and incubated for a further 7 days. However, identifying B. pseudomallei on the agar plates among other soil microbes requires expertise and experience. Here, we evaluate a lateral flow immunoassay (LFI) developed to detect B. pseudomallei capsular polysaccharide (CPS) in clinical samples as a tool to detect B. pseudomallei in environmental samples. METHODOLOGY/PRINCIPAL FINDINGS: First, we determined the limit of detection (LOD) of the LFI for enrichment broth of the soil specimens. Soil specimens (10 grams/specimen) that were culture negative for B. pseudomallei were spiked with B. pseudomallei ranging from 10 to 10⁵ CFU, and incubated in 10 ml of enrichment broth in air at 40°C. Then, on days 2, 4 and 7 of incubation, 50 µL of the upper layer of the broth was tested on the LFI, and colony counts were performed to determine the quantity of B. pseudomallei in the broth. We found that all five soil specimens inoculated at 10 CFU were negative by LFI on day 2, but four of those five specimens were LFI positive on day 7. The LOD of the LFI was estimated to be roughly 3.8×10⁶ CFU/ml, and culture broth on day 7 was selected as the optimal sample for LFI testing. Second, we evaluated the utility of the LFI by testing 105 soil samples from Northeast Thailand. All samples were also tested by standard culture and quantitative PCR (qPCR) targeting orf2. Of the 105 soil samples, 35 (33%) were LFI positive, 25 (24%) were culture positive for B. pseudomallei, and 79 (75%) were qPCR positive. Of the 11 LFI-positive but standard-culture-negative specimens, six were confirmed by having the day-7 enrichment broth culture positive for B. pseudomallei, and an additional three by qPCR. The LFI had 97% (30/31) sensitivity for detecting soil specimens culture positive for B. pseudomallei. CONCLUSIONS/SIGNIFICANCE: The LFI can be used to detect B. pseudomallei in soil samples and to select which samples should be sent to reference laboratories or processed further for bacterial isolation and confirmation. This could considerably decrease laboratory workload and assist the development of a risk map for melioidosis in resource-limited settings.


Subject(s)
Burkholderia pseudomallei/isolation & purification , Immunoassay/methods , Soil Microbiology , Burkholderia pseudomallei/chemistry , Burkholderia pseudomallei/immunology , Humans , Immunoassay/standards , Limit of Detection , Polysaccharides, Bacterial/immunology , Polysaccharides, Bacterial/isolation & purification , Real-Time Polymerase Chain Reaction , Sensitivity and Specificity , Thailand
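
A quick check of the 97% (30/31) sensitivity figure reported above, with an exact (Clopper-Pearson) confidence interval. The interval itself is not reported in the abstract and is computed here only to show the method.

```python
from scipy.stats import beta

positives_detected, culture_positives = 30, 31
sens = positives_detected / culture_positives

# Exact (Clopper-Pearson) 95% CI for a binomial proportion.
lo = beta.ppf(0.025, positives_detected, culture_positives - positives_detected + 1)
hi = beta.ppf(0.975, positives_detected + 1, culture_positives - positives_detected)
print(f"sensitivity {sens:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```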
16.
Appl Environ Microbiol ; 82(24): 7086-7092, 2016 12 15.
Article in English | MEDLINE | ID: mdl-27694236

ABSTRACT

Burkholderia pseudomallei is a soil-dwelling bacterium and the cause of melioidosis, which kills an estimated 89,000 people per year worldwide. Agricultural workers are at high risk of infection due to repeated exposure to the bacterium. Little is known about the soil physicochemical properties associated with the presence or absence of the organism. Here, we evaluated the soil physicochemical properties and presence of B. pseudomallei in 6,100 soil samples collected from 61 rice fields in Thailand. The presence of B. pseudomallei was negatively associated with the proportion of clay, proportion of moisture, level of salinity, percentage of organic matter, presence of cadmium, and nutrient levels (phosphorus, potassium, calcium, magnesium, and iron). The presence of B. pseudomallei was not associated with the level of soil acidity (P = 0.54). In a multivariable logistic regression model, the presence of B. pseudomallei was negatively associated with the percentage of organic matter (odds ratio [OR], 0.06; 95% confidence interval [CI], 0.01 to 0.47; P = 0.007), level of salinity (OR, 0.06; 95% CI, 0.01 to 0.74; P = 0.03), and percentage of soil moisture (OR, 0.81; 95% CI, 0.66 to 1.00; P = 0.05). Our study suggests that B. pseudomallei thrives in rice fields that are nutrient depleted. Some agricultural practices result in a decline in soil nutrients, which may impact the presence and amount of B. pseudomallei bacteria in affected areas. IMPORTANCE: Burkholderia pseudomallei is an environmental Gram-negative bacillus and the cause of melioidosis. Humans acquire the disease following skin inoculation, inhalation, or ingestion of the bacterium in the environment. The presence of B. pseudomallei in soil defines geographic regions where humans and livestock are at risk of melioidosis, yet little is known about the soil properties associated with the presence of the organism. We evaluated the soil properties and presence of B. pseudomallei in 61 rice fields in East, Central, and Northeast Thailand. We demonstrated that the organism was more commonly found in soils with lower levels of organic matter and nutrients, including phosphorus, potassium, calcium, magnesium, and iron. We also demonstrated that crop residue burning after harvest, which can reduce soil nutrients, was not uncommon. Some agricultural practices result in a decline in soil nutrients, which may impact the presence and amount of B. pseudomallei bacteria in affected areas.


Subject(s)
Burkholderia pseudomallei/isolation & purification , Soil Microbiology , Soil/chemistry , Burkholderia pseudomallei/classification , Burkholderia pseudomallei/genetics , Environment , Oryza/growth & development , Salinity , Thailand
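
A sketch of the kind of multivariable logistic regression reported in the abstract above (presence of B. pseudomallei versus organic matter, salinity and moisture), using statsmodels. The data frame, its column names and the simulated effects are hypothetical; the published odds ratios came from the authors' own model fitted to the 61 rice fields, not from this code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical soil sampling data: one row per sampling point.
rng = np.random.default_rng(2)
n = 500
soil = pd.DataFrame({
    "organic_matter_pct": rng.uniform(0.5, 5.0, n),
    "salinity_ds_m": rng.uniform(0.0, 2.0, n),
    "moisture_pct": rng.uniform(5, 40, n),
})
# Simulate presence so that higher organic matter, salinity and moisture lower the odds.
logit = 1.0 - 0.5 * soil["organic_matter_pct"] - 1.0 * soil["salinity_ds_m"] \
        - 0.03 * soil["moisture_pct"]
soil["bp_present"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

fit = smf.logit("bp_present ~ organic_matter_pct + salinity_ds_m + moisture_pct",
                data=soil).fit(disp=False)
odds_ratios = pd.concat([np.exp(fit.params), np.exp(fit.conf_int())], axis=1)
odds_ratios.columns = ["OR", "2.5%", "97.5%"]
print(odds_ratios)
```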
17.
Elife ; 52016 09 06.
Article in English | MEDLINE | ID: mdl-27599374

ABSTRACT

Little is known about the excess mortality caused by multidrug-resistant (MDR) bacterial infections in low- and middle-income countries (LMICs). We retrospectively obtained the microbiology laboratory and hospital databases of nine public hospitals in northeast Thailand from 2004 to 2010, and linked these with the national death registry to obtain 30-day mortality outcomes. The 30-day mortality in those with MDR community-acquired bacteraemia, healthcare-associated bacteraemia, and hospital-acquired bacteraemia was 35% (549/1555), 49% (247/500), and 53% (640/1198), respectively. We estimate that 19,122 of 45,209 (43%) deaths in patients with hospital-acquired infection due to MDR bacteria in Thailand in 2010 represented excess mortality caused by MDR. We demonstrate that national statistics on the epidemiology and burden of MDR in LMICs could be improved by integrating information from readily available databases. The prevalence of and mortality attributable to MDR in Thailand are high. This is likely to reflect the situation in other LMICs.


Subject(s)
Bacteremia/epidemiology , Bacteria/drug effects , Community-Acquired Infections/epidemiology , Cross Infection/epidemiology , Drug Resistance, Multiple, Bacterial , Bacteremia/microbiology , Bacteremia/mortality , Bacteria/classification , Bacteria/isolation & purification , Community-Acquired Infections/microbiology , Community-Acquired Infections/mortality , Cost of Illness , Cross Infection/microbiology , Cross Infection/mortality , Developing Countries , Hospitals , Humans , Microbial Sensitivity Tests , Retrospective Studies , Survival Analysis , Thailand/epidemiology
18.
J Clin Microbiol ; 54(6): 1472-1478, 2016 06.
Article in English | MEDLINE | ID: mdl-27008880

ABSTRACT

The enzyme-linked immunosorbent assay (ELISA) has been proposed as an alternative serologic diagnostic test to the indirect immunofluorescence assay (IFA) for scrub typhus. Here, we systematically determine the optimal sample dilution and cutoff optical density (OD) and estimate the accuracy of IgM ELISA using Bayesian latent class models (LCMs). Data from 135 patients with undifferentiated fever were reevaluated using Bayesian LCMs. Every patient was evaluated for the presence of an eschar and tested with a blood culture for Orientia tsutsugamushi, three different PCR assays, and an IgM IFA. The IgM ELISA was performed for every sample at sample dilutions from 1:100 to 1:102,400 using crude whole-cell antigens of the Karp, Kato, and Gilliam strains of O. tsutsugamushi developed by the Naval Medical Research Center. We used Bayesian LCMs to generate unbiased receiver operating characteristic curves and found that the sample dilution of 1:400 was optimal for the IgM ELISA. With the optimal cutoff OD of 1.474 at a sample dilution of 1:400, the IgM ELISA had a sensitivity of 85.7% (95% credible interval [CrI], 77.4% to 86.7%) and a specificity of 98.1% (95% CrI, 97.2% to 100%) using paired samples. For the ELISA, the OD could be determined objectively and quickly, in contrast to the reading of IFA slides, which was both subjective and labor-intensive. The IgM ELISA for scrub typhus has high diagnostic accuracy and is less subjective than the IgM IFA. We suggest that the IgM ELISA may be used as an alternative reference test to the IgM IFA for the serological diagnosis of scrub typhus.


Subject(s)
Antibodies, Bacterial/blood , Enzyme-Linked Immunosorbent Assay/methods , Immunoglobulin M/blood , Orientia tsutsugamushi/immunology , Scrub Typhus/diagnosis , Serologic Tests/methods , Adolescent , Adult , Aged , Aged, 80 and over , Female , Humans , Male , Middle Aged , Prospective Studies , ROC Curve , Sensitivity and Specificity , Thailand , Young Adult
19.
J Clin Microbiol ; 53(11): 3663-6, 2015 Nov.
Article in English | MEDLINE | ID: mdl-26354819

ABSTRACT

We determined the optimal cutoff titers in admission and convalescent-phase samples for scrub typhus indirect immunofluorescence assay using Bayesian latent class models. Cutoff titers of ≥1:3,200 in an admission sample or of a ≥4-fold rise to ≥1:3,200 in a convalescent-phase sample provided the highest accuracy (sensitivity, 81.6%; specificity, 100%).


Subject(s)
Fluorescent Antibody Technique, Indirect/methods , Orientia tsutsugamushi/immunology , Scrub Typhus/diagnosis , Adult , Antibodies, Bacterial/blood , Antibodies, Bacterial/immunology , Female , Humans , Immunoglobulin M/blood , Immunoglobulin M/immunology , Male , Middle Aged , ROC Curve , Scrub Typhus/microbiology
20.
PLoS One ; 10(5): e0114930, 2015.
Article in English | MEDLINE | ID: mdl-26024375

ABSTRACT

BACKGROUND: The indirect immunofluorescence assay (IFA) is considered a reference test for scrub typhus. Recently, the Scrub Typhus Infection Criteria (STIC; a combination of culture, PCR assays and IFA IgM) were proposed as a reference standard for evaluating alternative diagnostic tests. Here, we use Bayesian latent class models (LCMs) to estimate the true accuracy of each diagnostic test, and of STIC, for diagnosing scrub typhus. METHODS/PRINCIPAL FINDINGS: Data from 161 patients with undifferentiated fever were re-evaluated using Bayesian LCMs. Every patient was evaluated for the presence of an eschar, and tested with blood culture for Orientia tsutsugamushi, three different PCR assays, IFA IgM, and the Panbio IgM immunochromatographic test (ICT). True sensitivity and specificity of culture (24.4% and 100%), 56kDa PCR assay (56.8% and 98.4%), 47kDa PCR assay (63.2% and 96.1%), groEL PCR assay (71.4% and 93.0%), IFA IgM (70.0% and 83.8%), PanBio IgM ICT (72.8% and 96.8%), presence of eschar (42.7% and 98.9%) and STIC (90.5% and 82.5%) estimated by Bayesian LCM were considerably different from those obtained when using STIC as a reference standard. The IgM ICT had comparable sensitivity and significantly higher specificity compared to IFA (p=0.34 and p<0.001, respectively). CONCLUSIONS: The low specificity of STIC was caused by the low specificity of IFA IgM. Neither STIC nor IFA IgM can be used as reference standards against which to evaluate alternative diagnostic tests. Further evaluation of new diagnostic tests should be done with a carefully selected set of diagnostic tests and appropriate statistical models.


Subject(s)
Diagnostic Tests, Routine/standards , Scrub Typhus/diagnosis , Adolescent , Adult , Aged , Aged, 80 and over , Bayes Theorem , Female , Fluorescent Antibody Technique, Indirect , Humans , Immunoglobulin M/analysis , Male , Middle Aged , Polymerase Chain Reaction , Sensitivity and Specificity , Young Adult
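
The study above fitted Bayesian latent class models (LCMs) to several imperfect tests. As a deliberately simpler stand-in, not the authors' method, the sketch below fits a two-class latent class model by maximum likelihood with an EM algorithm, assuming conditional independence between tests. The three tests and their accuracies are simulated, so the recovered estimates have nothing to do with the figures reported above.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate 3 conditionally independent imperfect tests on 1000 patients.
n, prevalence = 1000, 0.4
true_se = np.array([0.60, 0.70, 0.85])
true_sp = np.array([0.98, 0.95, 0.90])
disease = rng.random(n) < prevalence
p_pos = np.where(disease[:, None], true_se, 1 - true_sp)
tests = (rng.random((n, 3)) < p_pos).astype(float)

# EM for a 2-class latent class model (a maximum-likelihood analogue of the
# Bayesian LCM used in the paper), assuming conditional independence.
prev, se, sp = 0.5, np.full(3, 0.8), np.full(3, 0.8)
for _ in range(500):
    # E-step: posterior probability that each patient is truly positive.
    like_pos = prev * np.prod(se**tests * (1 - se)**(1 - tests), axis=1)
    like_neg = (1 - prev) * np.prod((1 - sp)**tests * sp**(1 - tests), axis=1)
    w = like_pos / (like_pos + like_neg)
    # M-step: update prevalence, sensitivities and specificities.
    prev = w.mean()
    se = (w[:, None] * tests).sum(axis=0) / w.sum()
    sp = ((1 - w)[:, None] * (1 - tests)).sum(axis=0) / (1 - w).sum()

print("estimated prevalence:", round(prev, 3))
print("estimated sensitivities:", se.round(3))
print("estimated specificities:", sp.round(3))
```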