Results 1 - 6 of 6
1.
Brief Bioinform; 25(2), 2024 Jan 22.
Article in English | MEDLINE | ID: mdl-38483256

ABSTRACT

Numerous imaging techniques are available for observing and interrogating biological samples, and several of them can be used consecutively to enable correlative analysis of different image modalities with varying resolutions and the inclusion of structural or molecular information. Accurate registration of multimodal images is essential for correlative analysis, but it remains a challenging computer vision task with no widely accepted solution. Moreover, supervised registration methods require annotated data produced by experts, which is scarce. To address this challenge, we propose a general unsupervised pipeline for multimodal image registration using deep learning. We provide a comprehensive evaluation of the proposed pipeline against current state-of-the-art image registration and style transfer methods on four types of biological problems utilizing different microscopy modalities. We found that style transfer of modality domains paired with fully unsupervised training yields registration accuracy comparable to that of supervised methods and, most importantly, does not require human intervention.
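As a rough illustration of the unsupervised, intensity-based registration idea (not the authors' published pipeline), the sketch below optimizes a single 2D affine transform by minimizing a similarity loss between a fixed image and a moving image that is assumed to have already been mapped into the fixed modality's appearance, e.g. by an upstream style transfer step. Function and parameter names are illustrative.

```python
import torch
import torch.nn.functional as F

def register_affine(fixed, moving, iters=300, lr=1e-2):
    """Estimate a 2D affine transform aligning `moving` to `fixed`.

    fixed, moving: float tensors of shape (1, 1, H, W), intensities in [0, 1].
    Assumes `moving` was first style-transferred into the fixed modality,
    so a plain mean-squared-error similarity term is meaningful.
    """
    # Start from the identity transform (2x3 affine matrix).
    theta = torch.tensor([[1.0, 0.0, 0.0],
                          [0.0, 1.0, 0.0]], requires_grad=True)
    opt = torch.optim.Adam([theta], lr=lr)
    for _ in range(iters):
        grid = F.affine_grid(theta.unsqueeze(0), fixed.shape, align_corners=False)
        warped = F.grid_sample(moving, grid, align_corners=False)
        loss = F.mse_loss(warped, fixed)  # unsupervised similarity loss, no annotations
        opt.zero_grad()
        loss.backward()
        opt.step()
    return theta.detach(), warped.detach()
```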


Subject(s)
Deep Learning; Humans; Microscopy
2.
Biosensors (Basel); 13(2), 2023 Jan 26.
Article in English | MEDLINE | ID: mdl-36831953

ABSTRACT

Morphological and molecular analyses at the single-cell level play a fundamental role in understanding biology. These methods are utilized for cell phenotyping and in-depth studies of cellular processes, such as mitosis. Fluorescence microscopy and optical spectroscopy techniques, including Raman micro-spectroscopy, allow researchers to examine biological samples at the single-cell level in a non-destructive manner. Fluorescence microscopy can give detailed morphological information about the localization of stained molecules, while Raman microscopy can produce label-free images at the subcellular level; thus, it can reveal the spatial distribution of molecular fingerprints, even in live samples. Accordingly, the combination of correlative fluorescence and Raman microscopy (CFRM) offers a unique approach for studying cellular stages at the single-cell level. However, subcellular spectral maps are complex and challenging to interpret. Artificial intelligence (AI) may serve as a valuable solution for characterizing the molecular backgrounds of phenotypes and biological processes by finding the characteristic patterns in spectral maps. The major contributions of this manuscript are: (I) a comprehensive review of the literature on AI techniques in Raman-based cellular phenotyping; and (II) a case study describing a new neural network-based approach and discussing the opportunities and limitations of AI, specifically deep learning, for analyzing Raman spectroscopy data to classify mitotic cellular stages from their spectral maps.
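The network described in the case study is not reproduced here; the sketch below only shows the general shape of the idea: each spatial position of a Raman map yields one spectrum, which a small 1D convolutional network maps to a mitotic-stage label. The number of wavenumber channels and classes are placeholders.

```python
import torch
import torch.nn as nn

class RamanSpectrumClassifier(nn.Module):
    """Toy 1D CNN mapping a single Raman spectrum to a mitotic-stage label."""

    def __init__(self, n_wavenumbers=1000, n_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=9, padding=4), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=9, padding=4), nn.ReLU(), nn.MaxPool1d(4),
        )
        self.classifier = nn.Linear(32 * (n_wavenumbers // 16), n_classes)

    def forward(self, x):
        # x: (batch, 1, n_wavenumbers) -- one spectrum per spatial position of the map
        z = self.features(x)
        return self.classifier(z.flatten(1))

model = RamanSpectrumClassifier()
logits = model(torch.randn(8, 1, 1000))  # 8 spectra -> (8, 5) class scores
```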


Subject(s)
Artificial Intelligence; Spectrum Analysis, Raman; Microscopy, Fluorescence/methods; Spectrum Analysis, Raman/methods
3.
Sci Rep; 11(1): 14813, 2021 Jul 20.
Article in English | MEDLINE | ID: mdl-34285291

ABSTRACT

Recent statistics report that more than 3.7 million new cases of cancer occur in Europe yearly, and the disease accounts for approximately 20% of all deaths. High-throughput screening of cancer cell cultures has dominated the search for novel, effective anticancer therapies in the past decades. Recently, functional assays with patient-derived ex vivo 3D cell cultures have gained importance for drug discovery and precision medicine. We recently evaluated the major advancements and needs of 3D cell culture screening, and concluded that strictly standardized and robust sample preparation is the most desired development. Here we propose an artificial intelligence-guided, low-cost 3D cell culture delivery system. It consists of a light microscope, a micromanipulator, a syringe pump, and a controller computer. The system performs morphology-based feature analysis on spheroids and can select uniformly sized or shaped spheroids and transfer them between various sample holders. It can pick samples from standard sample holders, including Petri dishes and microwell plates, and transfer them to a variety of holders, up to 384-well plates. The device performs reliable semi- and fully automated spheroid transfer. This results in highly controlled experimental conditions and eliminates the non-trivial side effects of sample variability, a key requirement for next-generation precision medicine.
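The system's control software is not shown here; as a minimal sketch of the morphology-based selection step, the snippet below assumes a labeled segmentation mask of detected spheroids is already available and filters candidates by size and roundness before handing their coordinates to the micromanipulator. All thresholds are illustrative.

```python
import numpy as np
from skimage.measure import label, regionprops

def select_uniform_spheroids(mask, min_area=2000, max_area=20000, min_circularity=0.8):
    """Return centroids of spheroids whose size and roundness fall within bounds.

    mask: 2D boolean/uint8 array where nonzero pixels belong to spheroids.
    Thresholds are placeholders and would be tuned per assay.
    """
    selected = []
    for region in regionprops(label(mask)):
        # Circularity = 4*pi*area / perimeter^2 (1.0 for a perfect disk).
        circularity = 4 * np.pi * region.area / (region.perimeter ** 2 + 1e-9)
        if min_area <= region.area <= max_area and circularity >= min_circularity:
            selected.append(region.centroid)  # (row, col) target for the micromanipulator
    return selected
```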


Subject(s)
Cell Culture Techniques/instrumentation; Neoplasms/pathology; Spheroids, Cellular/cytology; Artificial Intelligence; Cell Line, Tumor; Deep Learning; Drug Screening Assays, Antitumor; High-Throughput Screening Assays; Humans; Neoplasms/drug therapy; Precision Medicine; Spheroids, Cellular/drug effects; Spheroids, Cellular/pathology
4.
Micromachines (Basel); 11(9), 2020 Sep 22.
Article in English | MEDLINE | ID: mdl-32972024

ABSTRACT

A cell elasticity measurement method is introduced that uses polymer microtools actuated by holographic optical tweezers. The microtools were prepared with two-photon polymerization. Their shape enables approaching the cells from any lateral direction. In the presented case, endothelial cells grown on vertical polymer walls were probed by the tools in a lateral direction. The use of specially shaped microtools protects the target cells from the photodamage that may arise during optical trapping. The position of the tools was recorded simply with video microscopy and analyzed with image processing methods. We critically compare the resulting Young's modulus values to those in the literature obtained by other methods. The application of optical tweezers extends the force range available for cell indentation measurements down to the fN regime. Our approach demonstrates a feasible alternative to the usual vertical indentation experiments.
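The abstract compares measured Young's moduli against literature values; as a reference point only, the snippet below evaluates the standard Hertz contact model for a spherical indenter. This is not the contact model used for the paper's lateral probing geometry, which may require a different analysis.

```python
import numpy as np

def hertz_young_modulus(force_N, indentation_m, probe_radius_m, poisson_ratio=0.5):
    """Young's modulus from the Hertz model for a spherical indenter.

    F = (4/3) * E / (1 - nu^2) * sqrt(R) * delta^(3/2), solved for E.
    Assumes small indentation of a flat, isotropic, elastic sample; the
    lateral-probing geometry of the paper may call for a different model.
    """
    return 0.75 * force_N * (1 - poisson_ratio ** 2) / (
        np.sqrt(probe_radius_m) * indentation_m ** 1.5)

# Example: 50 pN force producing a 100 nm indentation with a 1 um probe tip
E = hertz_young_modulus(50e-12, 100e-9, 1e-6)  # ~0.9 kPa, in the soft-cell range
```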

5.
Biomed Opt Express; 11(2): 945-962, 2020 Feb 01.
Article in English | MEDLINE | ID: mdl-32133231

ABSTRACT

Fluorescent observation of cells generally suffers from limited axial resolution due to the elongated point spread function of the microscope optics. Consequently, three-dimensional imaging results in axial resolution that is several times worse than the transversal one. The optical solutions to this problem usually require complicated optics and extreme spatial stability. A straightforward way to eliminate anisotropic resolution is to fuse images recorded from multiple viewing directions, achieved mostly by mechanical rotation of the entire sample. In the presented approach, multiview imaging of single cells is implemented by rotating them around an axis perpendicular to the optical axis by means of holographic optical tweezers. For this, the cells are indirectly trapped and manipulated with special microtools made with two-photon polymerization. The cell is firmly attached to the microtool and is precisely manipulated with 6 degrees of freedom. Full control over the cell's position and orientation allows for its multiview fluorescence imaging from arbitrarily selected directions. The image stacks obtained this way are combined into one 3D image array with a multiview image processing pipeline, resulting in isotropic optical resolution that approaches the lateral diffraction limit. The presented tool and manipulation scheme can be readily applied in various microscope platforms.
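A much-simplified stand-in for the multiview fusion step is sketched below: each stack, acquired at a known rotation of the cell, is rotated back into a common reference frame and the views are averaged. The actual pipeline also involves fine registration and a more sophisticated fusion; rotation axis, interpolation order, and angles here are placeholders.

```python
import numpy as np
from scipy.ndimage import rotate

def fuse_multiview_stacks(stacks, angles_deg):
    """Fuse 3D image stacks acquired at different rotations of the cell.

    stacks: list of 3D arrays (z, y, x) of identical shape.
    angles_deg: rotation of the cell about the y axis for each stack.
    Each stack is rotated back into the frame of the first view and all
    views are averaged; a real pipeline would add registration and a
    weighted or deconvolution-based fusion.
    """
    aligned = []
    for stack, angle in zip(stacks, angles_deg):
        # Undo the physical rotation: rotate in the z-x plane, keep the shape.
        aligned.append(rotate(stack, -angle, axes=(0, 2), reshape=False, order=1))
    return np.mean(aligned, axis=0)
```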

6.
Cell Syst; 10(5): 453-458.e6, 2020 May 20.
Article in English | MEDLINE | ID: mdl-34222682

ABSTRACT

Single-cell segmentation is typically a crucial step in image-based cellular analysis. We present nucleAIzer, a deep-learning approach aiming toward a truly general method for localizing 2D cell nuclei across a diverse range of assays and light microscopy modalities. We outperform the 739 methods submitted to the 2018 Data Science Bowl on images representing a variety of realistic conditions, some of which were not represented in the training data. The key to our approach is that during training nucleAIzer automatically adapts its nucleus-style model to unseen and unlabeled data, using image style transfer to automatically generate augmented training samples. This allows the model to recognize nuclei in new and different experiments efficiently without requiring expert annotations, making deep learning for nucleus segmentation fairly simple and labor-free for most biological light microscopy experiments. It can also be used online, is integrated into CellProfiler, and can be freely downloaded at www.nucleaizer.org. A record of this paper's transparent peer review process is included in the Supplemental Information.
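nucleAIzer's augmentation relies on a learned image style transfer; the snippet below uses simple histogram matching as a crude stand-in to show the shape of the idea: labeled source images are pushed toward the appearance of unlabeled target images before retraining a segmentation model. This is not the tool's actual method or API.

```python
import numpy as np
from skimage.exposure import match_histograms

def adapt_training_images(source_images, target_images, rng=None):
    """Crude stand-in for style transfer: match each labeled source image's
    intensity distribution to a randomly chosen unlabeled target image.

    source_images: list of 2D arrays with existing nucleus annotations.
    target_images: list of 2D arrays from the new, unlabeled experiment.
    The returned images keep their original masks but look more like the
    target domain, and would be added to the segmentation training set.
    """
    rng = rng or np.random.default_rng(0)
    adapted = []
    for img in source_images:
        reference = target_images[rng.integers(len(target_images))]
        adapted.append(match_histograms(img, reference))
    return adapted
```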


Subject(s)
Cell Nucleus; Deep Learning; Microscopy