Results 1 - 8 of 8
1.
Nat Commun ; 15(1): 3777, 2024 May 06.
Article in English | MEDLINE | ID: mdl-38710683

ABSTRACT

Liquid Chromatography Mass Spectrometry (LC-MS) is a powerful method for profiling complex biological samples. However, batch effects typically arise from differences in sample processing protocols, experimental conditions, and data acquisition techniques, significantly impacting the interpretability of results. Correcting batch effects is crucial for the reproducibility of omics research, but current methods are not optimal for removing batch effects without compressing the genuine biological variation under study. We propose a suite of Batch Effect Removal Neural Networks (BERNN) to remove batch effects in large LC-MS experiments, with the goal of maximizing sample classification performance between conditions. Importantly, these models must also generalize efficiently to batches not seen during training. A comparison of batch effect correction methods across five diverse datasets demonstrated that BERNN models consistently showed the strongest sample classification performance. However, the model producing the greatest classification improvements did not always perform best in terms of batch effect removal. Finally, we show that overcorrection of batch effects resulted in the loss of some essential biological variability. These findings highlight the importance of balancing batch effect removal against preserving valuable biological diversity in large-scale LC-MS experiments.
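The abstract does not give implementation details, but the core idea of removing batch signal from a learned representation while preserving condition labels can be sketched as an adversarially trained autoencoder. The layer sizes, loss weights, and the gradient-reversal formulation below are illustrative assumptions, not the authors' BERNN code.

```python
# Minimal sketch of an adversarial autoencoder for batch-effect removal,
# in the spirit of (but not identical to) BERNN. All hyperparameters are assumptions.
import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    """Identity on the forward pass, reversed (scaled) gradient on the backward pass."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None


class BatchCorrectionAE(nn.Module):
    def __init__(self, n_features, n_conditions, n_batches, latent_dim=64, lam=1.0):
        super().__init__()
        self.lam = lam
        self.encoder = nn.Sequential(nn.Linear(n_features, 256), nn.ReLU(),
                                     nn.Linear(256, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                                     nn.Linear(256, n_features))
        self.condition_head = nn.Linear(latent_dim, n_conditions)  # keep biological signal
        self.batch_head = nn.Linear(latent_dim, n_batches)         # remove batch signal

    def forward(self, x):
        z = self.encoder(x)
        recon = self.decoder(z)
        cond_logits = self.condition_head(z)
        # Gradient reversal: the encoder is trained to fool the batch classifier.
        batch_logits = self.batch_head(GradReverse.apply(z, self.lam))
        return recon, cond_logits, batch_logits


def training_step(model, x, y_cond, y_batch, optimizer):
    """One illustrative step: reconstruction + condition classification,
    with batch predictability of the latent space penalized via gradient reversal."""
    recon, cond_logits, batch_logits = model(x)
    loss = (nn.functional.mse_loss(recon, x)
            + nn.functional.cross_entropy(cond_logits, y_cond)
            + nn.functional.cross_entropy(batch_logits, y_batch))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

The trade-off the abstract highlights corresponds to the weight on the adversarial batch term: too strong a penalty removes batch effects but can also erase genuine biological variation.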


Subject(s)
Mass Spectrometry , Neural Networks, Computer , Chromatography, Liquid/methods , Mass Spectrometry/methods , Humans , Reproducibility of Results , Liquid Chromatography-Mass Spectrometry
2.
Res Sq ; 2023 Jul 06.
Article in English | MEDLINE | ID: mdl-37461653

ABSTRACT

Liquid Chromatography Mass Spectrometry (LC-MS) is a powerful method for profiling complex biological samples. However, batch effects typically arise from differences in sample processing protocols, experimental conditions, and data acquisition techniques, significantly impacting the interpretability of results. Correcting batch effects is crucial for the reproducibility of proteomics research, but current methods are not optimal for removing batch effects without compressing the genuine biological variation under study. We propose a suite of Batch Effect Removal Neural Networks (BERNN) to remove batch effects in large LC-MS experiments, with the goal of maximizing sample classification performance between conditions. Importantly, these models must also generalize efficiently to batches not seen during training. A comparison of batch effect correction methods across three diverse datasets demonstrated that BERNN models consistently showed the strongest sample classification performance. However, the model producing the greatest classification improvements did not always perform best in terms of batch effect removal. Finally, we show that overcorrection of batch effects resulted in the loss of some essential biological variability. These findings highlight the importance of balancing batch effect removal against preserving valuable biological diversity in large-scale LC-MS experiments.

3.
IEEE Trans Pattern Anal Mach Intell ; 44(9): 5681-5699, 2022 09.
Article in English | MEDLINE | ID: mdl-33819149

ABSTRACT

We consider predicting a user's head motion in 360° videos from two modalities only: the user's past positions and the video content (without access to other users' traces). We make two main contributions. First, we re-examine existing deep-learning approaches for this problem and identify hidden flaws through a thorough root-cause analysis. Second, building on this analysis, we design a new proposal that establishes state-of-the-art performance. First, re-assessing the existing methods that use both modalities, we obtain the surprising result that they all perform worse than baselines using the user's trajectory only. A root-cause analysis of the metrics, datasets, and neural architectures shows in particular that (i) the content can inform the prediction for horizons longer than 2 to 3 seconds (existing methods consider shorter horizons), and (ii) to compete with the baselines, a recurrent unit dedicated to processing the positions is necessary but not sufficient. Second, from a re-examination of the problem supported by the concept of Structural-RNN, we design a new deep neural architecture, named TRACK. TRACK achieves state-of-the-art performance on all considered datasets and prediction horizons, outperforming competitors by up to 20 percent on focus-type videos for horizons of 2-5 seconds. The entire framework (code and datasets) is available online and received an ACM reproducibility badge: https://gitlab.com/miguelfromeror/head-motion-prediction.
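As a rough illustration of the position-only baseline the authors found competitive, a recurrent model can be rolled out autoregressively over past head orientations. The hidden size, prediction horizon, and unit-vector parameterization below are assumptions for the sketch; this is not the TRACK architecture.

```python
# Sketch of a position-only head-motion predictor: encode past orientations
# with a GRU, then roll out future orientations autoregressively.
import torch
import torch.nn as nn


class PositionOnlyPredictor(nn.Module):
    def __init__(self, hidden=128, horizon=25):
        super().__init__()
        self.horizon = horizon
        self.encoder = nn.GRU(input_size=3, hidden_size=hidden, batch_first=True)
        self.decoder = nn.GRUCell(input_size=3, hidden_size=hidden)
        self.out = nn.Linear(hidden, 3)

    def forward(self, past_positions):                 # (B, T, 3) unit vectors on the sphere
        _, h = self.encoder(past_positions)
        h = h.squeeze(0)
        pos = past_positions[:, -1, :]
        preds = []
        for _ in range(self.horizon):                  # autoregressive roll-out
            h = self.decoder(pos, h)
            pos = nn.functional.normalize(self.out(h), dim=-1)  # stay on the unit sphere
            preds.append(pos)
        return torch.stack(preds, dim=1)               # (B, horizon, 3)
```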


Subject(s)
Algorithms , Neural Networks, Computer , Motion , Reproducibility of Results
4.
Nat Commun ; 11(1): 5595, 2020 11 05.
Article in English | MEDLINE | ID: mdl-33154370

ABSTRACT

Rapid and accurate clinical diagnosis remains challenging. One component of diagnostic tool development is the design of effective classification models for mass spectrometry (MS) data. Some machine learning approaches have been investigated, but these models require time-consuming preprocessing steps to remove artifacts, making them unsuitable for rapid analysis. Convolutional Neural Networks (CNNs) have been found to perform well under such circumstances since they can learn representations from raw data. However, their effectiveness decreases when the number of available training samples is small, which is a common situation in medicine. In this work, we investigate transfer learning on 1D-CNNs and then develop a cumulative learning method for cases where transfer learning alone is not powerful enough. We propose to train the same model through several classification tasks over various small datasets in order to accumulate knowledge in the resulting representation. Using rat brain data as the initial training dataset, a cumulative learning approach can reach a classification accuracy exceeding 98% on 1D clinical MS data. We demonstrate cumulative learning using datasets generated in different biological contexts, on different organisms, and acquired by different instruments. Here we show a promising strategy for improving MS data classification accuracy when only small numbers of samples are available.
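A minimal sketch of the cumulative-learning idea described above: one 1D-CNN backbone is trained successively on several small classification tasks (e.g. the rat-brain dataset first), with a fresh head per task, so that knowledge accumulates in the shared representation. The architecture, layer sizes, and training loop are assumptions for illustration, not the paper's model.

```python
# Cumulative learning sketch: a shared 1D-CNN backbone, retrained task after task.
import torch
import torch.nn as nn


class SpectrumBackbone(nn.Module):
    def __init__(self, n_channels=1):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=9, stride=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=9, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten())

    def forward(self, x):            # x: (B, 1, n_mz_bins) raw spectrum
        return self.features(x)      # (B, 32) learned representation


def cumulative_training(backbone, tasks, epochs=10):
    """Train the same backbone on a sequence of (dataloader, n_classes) tasks,
    accumulating knowledge in its weights; each task gets its own linear head."""
    for loader, n_classes in tasks:
        head = nn.Linear(32, n_classes)
        opt = torch.optim.Adam(list(backbone.parameters()) + list(head.parameters()), lr=1e-3)
        for _ in range(epochs):
            for x, y in loader:
                loss = nn.functional.cross_entropy(head(backbone(x)), y)
                opt.zero_grad()
                loss.backward()
                opt.step()
    return backbone
```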


Subject(s)
Deep Learning , Mass Spectrometry/methods , Neural Networks, Computer , Animals , Databases, Factual , Diagnosis, Computer-Assisted , Humans , Machine Learning , Mass Spectrometry/statistics & numerical data
5.
Med Image Anal ; 20(1): 237-48, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25547073

ABSTRACT

The proliferative activity of breast tumors, which is routinely estimated by counting mitotic figures in hematoxylin and eosin stained histology sections, is considered one of the most important prognostic markers. However, mitosis counting is laborious, subjective, and may suffer from low inter-observer agreement. With the wider acceptance of whole slide images in pathology labs, automatic image analysis has been proposed as a potential solution to these issues. In this paper, the results of the Assessment of Mitosis Detection Algorithms 2013 (AMIDA13) challenge are described. The challenge was based on a data set consisting of 12 training and 11 testing subjects, with more than one thousand mitotic figures annotated by multiple observers. Short descriptions and evaluation results for eleven methods are presented. The top performing method has an error rate comparable to the inter-observer agreement among pathologists.
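For context, mitosis detection results are typically scored by matching detections to annotated mitoses within a fixed distance and computing precision, recall, and F1. The greedy matching and the 30-pixel threshold below are assumptions for the sketch, not the AMIDA13 evaluation protocol.

```python
# Sketch of distance-based scoring of mitosis detections against annotations.
import numpy as np


def evaluate_detections(detections, ground_truth, max_dist=30.0):
    """detections, ground_truth: (N, 2) arrays of (row, col) centroids for one image."""
    detections = np.asarray(detections, dtype=float)
    ground_truth = np.asarray(ground_truth, dtype=float)
    gt_used = np.zeros(len(ground_truth), dtype=bool)
    tp = 0
    for d in detections:
        if len(ground_truth) == 0:
            break
        dists = np.linalg.norm(ground_truth - d, axis=1)
        dists[gt_used] = np.inf                      # each annotation matched at most once
        j = int(np.argmin(dists))
        if dists[j] <= max_dist:
            gt_used[j] = True
            tp += 1
    fp = len(detections) - tp
    fn = len(ground_truth) - tp
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1
```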


Subject(s)
Algorithms , Breast Neoplasms/pathology , Mitosis , Female , Humans , Observer Variation
6.
IEEE Trans Pattern Anal Mach Intell ; 34(2): 402-9, 2012 Feb.
Article in English | MEDLINE | ID: mdl-21968915

ABSTRACT

In the past 10 years, new powerful algorithms based on efficient data structures have been proposed to solve the problem of nearest neighbors search (or approximate nearest neighbors search). While the Euclidean Locality Sensitive Hashing (LSH) algorithm, which provides approximate nearest neighbors in a Euclidean space with sublinear complexity, is probably the most popular, the Euclidean metric does not always provide results as accurate and relevant as similarity measures such as the Earth Mover's Distance and the χ2 distance. In this paper, we present a new LSH scheme adapted to the χ2 distance for approximate nearest neighbors search in high-dimensional spaces. We define the specific hashing functions, prove their locality-sensitivity, and compare our method, through experiments, with the Euclidean Locality Sensitive Hashing algorithm in the context of image retrieval on real image databases. The results demonstrate the relevance of this new LSH scheme, which either provides far better image retrieval accuracy than the Euclidean scheme at an equivalent speed, or provides equivalent accuracy with a large gain in processing speed.
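The abstract does not spell out the χ2-adapted hash family, so the sketch below only illustrates the Euclidean (p-stable) LSH baseline it compares against: random Gaussian projections quantized into buckets of width w. The number of hash functions, bucket width, and descriptor dimension are assumptions.

```python
# Euclidean (2-stable) LSH sketch: nearby points collide in the same bucket
# with high probability; the authors' chi-squared hash family is not reproduced here.
import numpy as np


class EuclideanLSH:
    def __init__(self, dim, n_hashes=16, w=4.0, seed=0):
        rng = np.random.default_rng(seed)
        self.a = rng.standard_normal((n_hashes, dim))   # Gaussian projection directions
        self.b = rng.uniform(0.0, w, size=n_hashes)     # random offsets
        self.w = w

    def hash(self, x):
        # Project, shift, and quantize; the tuple of bucket indices is the key.
        return tuple(np.floor((self.a @ x + self.b) / self.w).astype(int))


# Usage idea: hash all database descriptors into a dict of buckets,
# then answer a query by scanning only the descriptors in its bucket.
# index = {}
# for i, v in enumerate(descriptors):
#     index.setdefault(lsh.hash(v), []).append(i)
```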

7.
IEEE Trans Image Process ; 14(7): 910-24, 2005 Jul.
Article in English | MEDLINE | ID: mdl-16028555

ABSTRACT

This paper deals with fast image and video segmentation using active contours. Region-based active contours using level sets are powerful techniques for video segmentation, but they suffer from a large computational cost. A parametric active contour method based on B-spline interpolation has previously been proposed to greatly reduce this computational cost, but that method is sensitive to noise. Here, we choose to relax the rigid interpolation constraint in order to make our method robust in the presence of noise: by using smoothing splines, we trade a tunable amount of interpolation error for a smoother spline curve. We show by experiments on natural sequences that this new flexibility yields segmentation results of higher quality at no additional computational cost. Hence, real-time processing for moving-object segmentation is preserved.
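The smoothing-spline trade-off described above can be illustrated with SciPy's parametric spline fitting, where the smoothing factor s controls how much interpolation error is traded for curve smoothness. The particular value of s and the resampling density are assumptions for the sketch.

```python
# Illustrative sketch: fit a smoothing spline to noisy contour points instead of
# interpolating them exactly (s=0 would reproduce rigid interpolation).
import numpy as np
from scipy.interpolate import splprep, splev


def smooth_contour(points, s=5.0, n_samples=200):
    """points: (N, 2) noisy contour vertices; returns a smoother closed curve."""
    x, y = points[:, 0], points[:, 1]
    tck, _ = splprep([x, y], s=s, per=True)   # per=True treats the contour as closed
    u = np.linspace(0.0, 1.0, n_samples)
    xs, ys = splev(u, tck)
    return np.column_stack([xs, ys])
```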


Subject(s)
Algorithms , Artificial Intelligence , Image Enhancement/methods , Image Interpretation, Computer-Assisted/methods , Information Storage and Retrieval/methods , Pattern Recognition, Automated/methods , Video Recording/methods , Numerical Analysis, Computer-Assisted , Signal Processing, Computer-Assisted
8.
Appl Opt ; 43(2): 247-56, 2004 Jan 10.
Article in English | MEDLINE | ID: mdl-14735944

ABSTRACT

Image and sequence segmentation are discussed from the point of view of optimizing the segmentation criterion. Such a criterion involves so-called (boundary and region) descriptors, which, in general, may depend on their respective boundaries or regions. This dependency must be taken into account when computing the derivative of the criterion with respect to the unknown object domain (defined by its boundary); if it is not, some corrective terms may be omitted. We describe how to compute the derivative of the segmentation criterion with a dynamic scheme. The scheme is general enough to provide a framework for a wide variety of segmentation applications, and it also provides a theoretical grounding for the philosophy of active contours.
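As a hedged illustration in standard shape-derivative notation (not reproduced from the paper), a region term of such a criterion and its derivative with respect to the domain take the following form, where the first integral is exactly the kind of corrective term that is lost if the domain dependence of the descriptor is ignored:

```latex
% Region term with a region-dependent descriptor k, and its derivative in the
% direction of a velocity field V. N is the unit normal to the boundary
% (sign conventions vary between inward and outward normals).
\[
  J(\Omega) \;=\; \int_{\Omega} k(x,\Omega)\,dx ,
\]
\[
  dJ(\Omega;V) \;=\;
    \int_{\Omega} k_s(x,\Omega,V)\,dx
    \;-\; \int_{\partial\Omega} k(x,\Omega)\,\big(V(x)\cdot N(x)\big)\,da(x),
\]
% where k_s denotes the derivative of the descriptor with respect to the domain.
```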
