Results 1 - 6 of 6
1.
PLoS One ; 12(8): e0183478, 2017.
Article in English | MEDLINE | ID: mdl-28854255

ABSTRACT

The rapid pace of bacterial evolution enables organisms to adapt to the laboratory environment with repeated passage and thus diverge from naturally occurring environmental ("wild") strains. Distinguishing wild from laboratory strains is clearly important for biodefense and bioforensics; however, DNA sequence data alone have thus far not provided a clear signature, perhaps because of a lack of understanding of how diverse genome changes lead to convergent phenotypes, difficulty in detecting certain types of mutations, or because some adaptive modifications are epigenetic. Monitoring protein abundance, a molecular measure of phenotype, can overcome some of these difficulties. We have assembled a collection of Yersinia pestis proteomics datasets from our own published and unpublished work, and from a proteomics data archive, and demonstrated that protein abundance data can clearly distinguish laboratory-adapted from wild strains. We developed a lasso logistic regression classifier that uses binary (presence/absence) or quantitative protein abundance measures to predict whether a sample is laboratory-adapted or wild; the classifier proved to be ~98% accurate, as judged by replicated 10-fold cross-validation. Protein features selected by the classifier accord well with our previous study of laboratory adaptation in Y. pestis. The input data were derived from a variety of unrelated experiments and contained significant confounding variables; we show that the classifier is robust with respect to these variables. The methodology is able to discover signatures for laboratory facility and culture medium that are largely independent of the signature of laboratory adaptation. Going beyond our previous laboratory evolution study, this work suggests that proteomic differences between laboratory-adapted and wild Y. pestis are general, potentially pointing to a process that could apply to other species as well. Additionally, we show that proteomics datasets (even archived data collected for different purposes) contain the information necessary to distinguish wild and laboratory samples. This work has clear applications in biomarker detection as well as biodefense.
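The classifier described above can be sketched in a few lines. This is a minimal illustration of an L1-penalized (lasso) logistic regression on binary presence/absence protein features with replicated cross-validation; the data, feature counts, and regularization strength here are all invented, not the authors' actual Y. pestis datasets.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
n_samples, n_proteins = 200, 500

# Synthetic presence/absence matrix: most proteins are uninformative, while a
# handful shift frequency between "wild" (0) and "lab-adapted" (1) samples.
y = rng.integers(0, 2, n_samples)
X = rng.random((n_samples, n_proteins)) < 0.5
informative = np.arange(10)
X[:, informative] = rng.random((n_samples, 10)) < np.where(y[:, None] == 1, 0.9, 0.1)
X = X.astype(float)

# The L1 penalty drives most coefficients to exactly zero, so the fitted model
# doubles as a feature selector.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
scores = cross_val_score(clf, X, y, cv=StratifiedKFold(10, shuffle=True, random_state=0))
print(f"10-fold CV accuracy: {scores.mean():.3f}")

clf.fit(X, y)
selected = np.flatnonzero(clf.coef_[0])
print(f"proteins selected: {len(selected)} of {n_proteins}")
```

On data with a real signal, the nonzero coefficients play the role of the "protein features selected by the classifier" mentioned in the abstract.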


Subjects
Adaptation, Physiological , Bacterial Proteins/metabolism , Plague/microbiology , Yersinia pestis/metabolism , Bacteriological Techniques , Environmental Microbiology , Humans , Logistic Models , Phenotype , Plague/diagnosis , Proteome/metabolism , Proteomics/methods , Species Specificity , Yersinia pestis/classification , Yersinia pestis/genetics
2.
J Proteome Res ; 13(4): 2215-22, 2014 Apr 04.
Article in English | MEDLINE | ID: mdl-24611607

ABSTRACT

Ensuring data quality and proper instrument functionality is a prerequisite for scientific investigation. Manual quality assurance is time-consuming and subjective. Metrics for describing liquid chromatography-mass spectrometry (LC-MS) data have been developed; however, the wide variety of LC-MS instruments and configurations precludes applying a simple cutoff. Using 1150 manually classified quality control (QC) data sets, we trained logistic regression classification models to predict whether a data set is in or out of control. Model parameters were optimized by minimizing a loss function that accounts for the trade-off between false positive and false negative errors. The classifier models detected bad data sets with high sensitivity while maintaining high specificity. Moreover, the composite classifier was dramatically more specific than any single metric. Finally, we evaluated the performance of the classifier on a separate validation set, where it performed comparably to the results for the testing/training data sets. By presenting the methods and software used to create the classifier, we enable other groups to create classifiers for their own QC regimens, which vary widely from lab to lab. In total, this manuscript presents 3400 LC-MS data sets for the same QC sample (a whole-cell lysate of Shewanella oneidensis), deposited in ProteomeXchange with identifiers PXD000320-PXD000324.
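The core idea of the QC classifier, a logistic model plus a decision threshold chosen to minimize an asymmetric false-positive/false-negative loss, can be sketched as follows. The toy metrics, label rates, and cost weights are invented placeholders, not the metrics or loss used in the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 1150
# Two toy QC metrics (imagine peptide count and peak-width statistics);
# "bad" runs (label 1) shift both metrics upward.
y = (rng.random(n) < 0.2).astype(int)
X = rng.normal(0.0, 1.0, (n, 2)) + y[:, None] * 1.5

clf = LogisticRegression().fit(X, y)
probs = clf.predict_proba(X)[:, 1]

# Pick the threshold minimizing cost = c_fn * FN + c_fp * FP.
# Assumed: letting a bad data set pass (FN) is 5x worse than flagging a good one.
c_fn, c_fp = 5.0, 1.0
thresholds = np.linspace(0.0, 1.0, 101)
costs = [c_fn * np.sum((probs < t) & (y == 1)) + c_fp * np.sum((probs >= t) & (y == 0))
         for t in thresholds]
best_t = thresholds[int(np.argmin(costs))]
print(f"selected threshold: {best_t:.2f}")
```

Because false negatives cost more, the selected threshold lands well below the default 0.5, trading some specificity for the high sensitivity the abstract emphasizes.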


Subjects
Chromatography, Liquid/methods , Chromatography, Liquid/standards , Mass Spectrometry/methods , Mass Spectrometry/standards , Models, Statistical , Quality Control , Reproducibility of Results , Research Design
3.
J Chromatogr A ; 1270: 269-82, 2012 Dec 28.
Article in English | MEDLINE | ID: mdl-23177156

ABSTRACT

Dimethyl methylphosphonate (DMMP) was used as a chemical threat agent (CTA) simulant for a first look at the effects of real-world factors on the recovery and exploitation of a CTA's impurity profile for source matching. Four stocks of DMMP having different impurity profiles were disseminated as aerosols onto cotton, painted wall board, and nylon coupons according to a thorough experimental design. The DMMP-exposed coupons were then solvent extracted and analyzed for DMMP impurities by comprehensive 2D gas chromatography/mass spectrometry (GC×GC/MS). The similarities between the coupon DMMP impurity profiles and the known (reference) DMMP profiles were measured by dot products of the coupon and known profiles and by score values obtained from principal component analysis. One stock, with a high impurity-profile selectivity value of 0.9 out of 1, had 100% of its respective coupons correctly classified and no false positives from other coupons. Coupons from the other three stocks, with low selectivity values (0.0073, 0.012, and 0.018), could not be sufficiently distinguished from one another for reliable matching to their respective stocks. The results from this work support the following conclusions: (1) extraction solvents, if not appropriately selected, can contain some of the same impurities present in a CTA, reducing the CTA's usable impurity profile; (2) low selectivity among a CTA's known impurity profiles will likely make definitive source matching impossible under some real-world conditions; (3) no detrimental chemical-matrix interference was encountered during the analysis of actual office media; (4) a short elapsed time between release and sample storage is advantageous for recovering the impurity profile because it minimizes volatilization of forensic impurities; and (5) forensic impurity profiles weighted toward higher-volatility impurities are more likely to be altered by volatilization following CTA exposure.
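The dot-product matching step can be illustrated with a small sketch: normalize each impurity profile and pick the reference stock with the highest normalized dot product (cosine similarity). The peak-area vectors below are made up for illustration, not real GC×GC/MS data.

```python
import numpy as np

def match_source(coupon_profile, reference_profiles):
    """Return (index of best-matching reference, list of similarities).

    Similarity is the dot product of unit-normalized impurity profiles.
    """
    c = coupon_profile / np.linalg.norm(coupon_profile)
    sims = [float(c @ (r / np.linalg.norm(r))) for r in reference_profiles]
    return int(np.argmax(sims)), sims

# Four hypothetical stocks, each a vector of impurity peak areas.
refs = [np.array([5.0, 1.0, 0.1, 0.0]),
        np.array([0.2, 4.0, 1.0, 0.5]),
        np.array([0.1, 0.3, 6.0, 0.2]),
        np.array([1.0, 1.0, 1.0, 1.0])]

# A coupon extract: stock 2's profile perturbed by noise and partial
# volatilization loss of the lighter impurities.
coupon = np.array([0.15, 0.35, 5.0, 0.1])
best, sims = match_source(coupon, refs)
print(f"best match: stock {best}, similarity {sims[best]:.3f}")
```

When the reference profiles are nearly parallel (the low-selectivity case in the abstract), the similarities bunch together and the argmax becomes unreliable, which is exactly why the three low-selectivity stocks could not be matched.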


Subjects
Chemical Warfare Agents/analysis , Environmental Monitoring/methods , Environmental Pollutants/analysis , Forensic Sciences/methods , Chemical Terrorism , Chemical Warfare Agents/chemistry , Cotton Fiber , Environmental Pollutants/chemistry , Gas Chromatography-Mass Spectrometry , Nylons/chemistry , Organophosphorus Compounds/analysis , Organophosphorus Compounds/chemistry
4.
J Res Natl Inst Stand Technol ; 115(2): 113-47, 2010.
Article in English | MEDLINE | ID: mdl-27134782

ABSTRACT

In an effort to validate and demonstrate response and recovery sampling approaches and technologies, the U.S. Department of Homeland Security (DHS), along with several other agencies, has simulated a biothreat agent release within a facility at Idaho National Laboratory (INL) on two separate occasions, in the fall of 2007 and the fall of 2008. Because these events constitute only two realizations of many possible scenarios, increased understanding of sampling strategies can be obtained by virtually examining a wide variety of release and dispersion scenarios using computer simulations. This research effort demonstrates the use of two software tools: CONTAM, developed by the National Institute of Standards and Technology (NIST), and Visual Sample Plan (VSP), developed by Pacific Northwest National Laboratory (PNNL). The CONTAM modeling software was used to virtually contaminate a model of the INL test building under various release and dissemination scenarios as well as a range of building design and operation parameters. The results of these CONTAM simulations were then used to investigate the relevance and performance of various sampling strategies using VSP. A fundamental outcome of this project was the demonstration of how CONTAM and VSP can be used together to effectively develop sampling plans that support the various stages of response to an airborne chemical, biological, radiological, or nuclear event. Following such an event (or prior to one), incident details and the conceptual site model could be used to create an ensemble of CONTAM simulations that model contaminant dispersion within a building. These predictions could then be used to identify priority zones within the building, and sampling designs and strategies could be developed based on those zones.

5.
Stat Med ; 28(9): 1386-401, 2009 Apr 30.
Article in English | MEDLINE | ID: mdl-19247982

ABSTRACT

We consider the monitoring of surgical outcomes, where each patient has a different risk of post-operative mortality due to risk factors that exist prior to the surgery. We propose a risk-adjusted (RA) survival time CUSUM chart (RAST CUSUM) for monitoring a continuous, time-to-event variable that may be right-censored. Risk adjustment is accomplished using accelerated failure time regression models. We compare the average run length performance of the RAST CUSUM chart with that of the RA Bernoulli CUSUM chart, using data from cardiac surgeries to motivate the details of the comparison. The comparisons show that the RAST CUSUM chart is more efficient at detecting a sudden increase in the odds of mortality than the RA Bernoulli CUSUM chart, especially when the fraction of censored observations is relatively low or when a small increase in the odds of mortality occurs. We also discuss the impact of the amount of training data used to estimate chart parameters as well as the implementation of the RAST CUSUM chart during prospective monitoring.
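The comparison chart used above, the risk-adjusted Bernoulli CUSUM, is the simpler of the two and can be sketched directly: each patient carries an individual pre-operative mortality risk, and the chart accumulates log-likelihood-ratio weights testing for a common odds-ratio increase. The risks, odds-ratio shift, and control limit below are invented for illustration; the survival-time (RAST) variant adds AFT regression and censoring handling not shown here.

```python
import math
import random

def ra_bernoulli_cusum(outcomes, risks, odds_ratio, h):
    """Return the index of the first signal, or None if the chart never signals.

    outcomes[i] is 1 (death) or 0 (survival); risks[i] is the patient's
    pre-operative mortality probability under in-control conditions.
    """
    s = 0.0
    for i, (x, p0) in enumerate(zip(outcomes, risks)):
        # Out-of-control probability for this patient under odds ratio R.
        p1 = odds_ratio * p0 / (1 - p0 + odds_ratio * p0)
        w = math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        s = max(0.0, s + w)  # CUSUM with reflection at zero
        if s >= h:
            return i
    return None

random.seed(0)
# 300 in-control patients, then 300 patients whose mortality odds triple.
risks = [random.uniform(0.01, 0.2) for _ in range(600)]

def simulate(p, odds_shift):
    odds = odds_shift * p / (1 - p)
    return int(random.random() < odds / (1 + odds))

outcomes = [simulate(p, 1.0) for p in risks[:300]] + \
           [simulate(p, 3.0) for p in risks[300:]]
signal_at = ra_bernoulli_cusum(outcomes, risks, odds_ratio=2.0, h=3.5)
print(f"signal at patient index: {signal_at}")
```

Risk adjustment matters here because a run of high-risk patients would otherwise look like a process shift; weighting each outcome by its patient-specific likelihood ratio keeps the chart calibrated.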


Subjects
Survival Analysis , Biometry , Cardiovascular Surgical Procedures/mortality , Databases, Factual , Humans , Likelihood Functions , Logistic Models , Models, Statistical , Postoperative Complications/mortality , Risk Factors
6.
Stat Med ; 27(8): 1225-47, 2008 Apr 15.
Article in English | MEDLINE | ID: mdl-17879266

ABSTRACT

A number of methods have been proposed for detecting an increase in the incidence rate of a rare health event, such as a congenital malformation. Among these are the sets method, two modifications of the sets method, and the CUSUM method based on the Poisson distribution. We consider the situation where data are observed as a sequence of Bernoulli trials and propose the Bernoulli CUSUM chart as a desirable method for the surveillance of rare health events. We compared the performance of the sets method and its modifications with that of the Bernoulli CUSUM chart under a wide variety of circumstances. Chart design parameters were chosen to satisfy a minimax criterion. We used the steady-state average run length to measure chart performance instead of the average run length (ARL), which was used in nearly all previous comparisons involving the sets method or its modifications. Except in a few instances, we found that the Bernoulli CUSUM chart has better steady-state ARL performance than the sets method and its modifications for the extensive number of cases considered. Thus, we recommend the use of the Bernoulli CUSUM chart to monitor small incidence rates and provide practical advice for its implementation.
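A basic Bernoulli CUSUM of the kind recommended above is short to implement: accumulate the log-likelihood-ratio increment for each trial, reflect at zero, and signal when the statistic crosses a control limit. The in-control and out-of-control rates and the limit h below are illustrative, not the minimax design values from the paper.

```python
import math
import random

def bernoulli_cusum(outcomes, p0, p1, h):
    """Signal when the CUSUM statistic first crosses h; return that index or None.

    p0 is the in-control event rate, p1 the out-of-control rate the chart is
    tuned to detect; outcomes is a 0/1 sequence of Bernoulli trials.
    """
    inc1 = math.log(p1 / p0)              # increment when the event occurs
    inc0 = math.log((1 - p1) / (1 - p0))  # (negative) increment otherwise
    s = 0.0
    for i, x in enumerate(outcomes):
        s = max(0.0, s + (inc1 if x else inc0))
        if s >= h:
            return i
    return None

# In-control stretch at a 1% event rate, followed by an increase to 5%.
random.seed(0)
data = [int(random.random() < 0.01) for _ in range(500)] + \
       [int(random.random() < 0.05) for _ in range(500)]
signal_at = bernoulli_cusum(data, p0=0.01, p1=0.05, h=3.0)
print(f"signal at observation: {signal_at}")
```

Because events are rare, the statistic jumps sharply on each event and decays slowly between events, which is what makes this chart well suited to monitoring small incidence rates.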


Subjects
Congenital Abnormalities/epidemiology , Models, Statistical , Population Surveillance/methods , Humans , Incidence , Infant, Newborn , Poisson Distribution , Research Design