Results 1 - 10 of 10
1.
Med Image Anal; 90: 102940, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37666115

ABSTRACT

Cross-modality image synthesis is an active research topic with multiple clinically relevant medical applications. Recently, methods allowing training with paired but misaligned data have started to emerge. However, no robust and well-performing methods applicable to a wide range of real-world data sets exist. In this work, we propose a generic solution to the problem of cross-modality image synthesis with paired but non-aligned data by introducing new loss functions that encourage deformation equivariance. The method consists of joint training of an image synthesis network together with separate registration networks and allows adversarial training conditioned on the input even with misaligned data. The work lowers the bar for new clinical applications by allowing effortless training of cross-modality image synthesis networks for more difficult data sets.
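
To make the idea of a deformation-equivariance-encouraging loss concrete, here is a minimal sketch, not the authors' implementation: the synthesis network G should commute with a spatial deformation T, so G(T(x)) is penalized against T(G(x)). The random affine deformation, the stand-in generator, and all parameter values are illustrative assumptions.

```python
# Minimal sketch (not the paper's implementation) of a deformation-equivariance
# loss: the generator G should commute with a spatial deformation T.
import torch
import torch.nn.functional as F

def random_affine_grid(batch, height, width, max_shift=0.05, device="cpu"):
    """Build a mildly perturbed affine sampling grid (the deformation T)."""
    theta = torch.eye(2, 3, device=device).unsqueeze(0).repeat(batch, 1, 1)
    theta[:, :, 2] += (torch.rand(batch, 2, device=device) - 0.5) * 2 * max_shift
    return F.affine_grid(theta, [batch, 1, height, width], align_corners=False)

def deformation_equivariance_loss(generator, x):
    """Penalize the difference between G(T(x)) and T(G(x))."""
    b, _, h, w = x.shape
    grid = random_affine_grid(b, h, w, device=x.device)
    warp = lambda img: F.grid_sample(img, grid, align_corners=False)
    return F.l1_loss(generator(warp(x)), warp(generator(x)))

# Example with a trivial stand-in generator (1 channel in, 1 channel out).
generator = torch.nn.Conv2d(1, 1, kernel_size=3, padding=1)
x = torch.rand(2, 1, 64, 64)
print(deformation_equivariance_loss(generator, x))
```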

2.
Cancer Cell; 41(9): 1543-1545, 2023 Sep 11.
Article in English | MEDLINE | ID: mdl-37652005

ABSTRACT

Artificial intelligence (AI) is rapidly gaining interest in medicine, including in pathological assessments for personalized medicine. In this issue of Cancer Cell, Wagner et al. demonstrate the superior accuracy of transformer-based deep learning in predicting biomarker status in colorectal cancer (CRC). The work has implications for increased efficiency and accuracy in clinical diagnostics guiding treatment decisions in precision oncology.


Subject(s)
Colorectal Neoplasms , Deep Learning , Humans , Biomarkers, Tumor , Artificial Intelligence , Precision Medicine
3.
Patterns (N Y); 4(5): 100725, 2023 May 12.
Article in English | MEDLINE | ID: mdl-37223268

ABSTRACT

Conventional histopathology has relied on chemical staining for over a century. The staining process makes tissue sections visible to the human eye through a tedious and labor-intensive procedure that alters the tissue irreversibly, preventing repeated use of the sample. Deep learning-based virtual staining can potentially alleviate these shortcomings. Here, we used standard brightfield microscopy on unstained tissue sections and studied the impact of increased network capacity on the resulting virtually stained H&E images. Using the generative adversarial neural network model pix2pix as a baseline, we observed that replacing simple convolutions with dense convolution units increased the structural similarity score, peak signal-to-noise ratio, and nuclei reproduction accuracy. We also demonstrated highly accurate reproduction of histology, especially with increased network capacity, and showed applicability to several tissues. We show that network architecture optimization can improve the image translation accuracy of virtual H&E staining, highlighting the potential of virtual staining in streamlining histopathological analysis.
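
As a rough sketch of what replacing plain convolutions with dense convolution units can look like in a pix2pix-style generator, the block below concatenates the outputs of all previous layers before each convolution (DenseNet-style). Channel counts, growth rate, and layer count are illustrative assumptions, not the values used in the study.

```python
# Hypothetical "dense convolution unit" that could replace a plain conv block.
import torch
import torch.nn as nn

class DenseConvUnit(nn.Module):
    """Each layer receives the concatenation of all previous feature maps."""
    def __init__(self, in_channels, growth_rate=16, num_layers=4):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.Conv2d(channels, growth_rate, kernel_size=3, padding=1),
                nn.BatchNorm2d(growth_rate),
                nn.ReLU(inplace=True),
            ))
            channels += growth_rate
        # Project back to the channel count expected by the surrounding network.
        self.project = nn.Conv2d(channels, in_channels, kernel_size=1)

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            features.append(layer(torch.cat(features, dim=1)))
        return self.project(torch.cat(features, dim=1))

block = DenseConvUnit(in_channels=64)
print(block(torch.rand(1, 64, 32, 32)).shape)  # torch.Size([1, 64, 32, 32])
```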

4.
Heliyon; 8(1): e08762, 2022 Jan.
Article in English | MEDLINE | ID: mdl-35128089

ABSTRACT

Histological changes in tissue are of primary importance in pathological research and diagnosis. Automated histological analysis requires the ability to computationally separate pathological alterations from normal tissue. Conventional histopathological assessments are performed on individual tissue sections, leading to the loss of the three-dimensional context of the tissue. Yet, the tissue context and spatial determinants are critical in several pathologies, such as in understanding the growth patterns of cancer in its local environment. Here, we develop computational methods for visualization and quantitative assessment of histopathological alterations in three dimensions. First, we reconstruct a 3D representation of the whole organ from serially sectioned tissue. Then, we proceed to analyze the histological characteristics and regions of interest in 3D. As our example cases, we use whole slide images of hematoxylin-eosin-stained whole mouse prostates from a Pten+/- mouse prostate tumor model. We show that quantitative assessment of tumor sizes, shapes, and separation between spatial locations within the organ enables characterization and grouping of tumors. Further, we show that 3D visualization of tissue with computationally quantified features provides an intuitive way to observe tissue pathology. Our results underline the heterogeneity in composition and cellular organization within individual tumors. As an example, we show how prostate tumors have nuclear density gradients indicating tumor growth directions and reflecting varying pressure from the surrounding tissue. The methods presented here are applicable to any tissue and to different types of pathologies. This work provides a proof of principle for gaining a comprehensive view of histology by studying it quantitatively in 3D.
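
A small illustrative sketch of one quantitative step the abstract mentions: estimating a nuclear density map from detected nucleus centroids and taking its gradient as an indicator of growth direction. The rasterization, smoothing scale, and toy data are assumptions, not the published pipeline.

```python
# Toy nuclear density map and its gradient from nucleus centroid coordinates.
import numpy as np
from scipy.ndimage import gaussian_filter

def nuclear_density_gradient(nucleus_xy, shape, sigma=25.0):
    """Rasterize centroids, smooth into a density map, return map and gradient."""
    counts = np.zeros(shape, dtype=float)
    ys, xs = nucleus_xy[:, 1].astype(int), nucleus_xy[:, 0].astype(int)
    np.add.at(counts, (ys, xs), 1.0)
    density = gaussian_filter(counts, sigma=sigma)
    grad_y, grad_x = np.gradient(density)
    return density, grad_y, grad_x

# Toy example: 500 random nucleus positions on a 512 x 512 section.
rng = np.random.default_rng(0)
xy = rng.integers(0, 512, size=(500, 2))
density, gy, gx = nuclear_density_gradient(xy, shape=(512, 512))
print(density.shape, float(density.sum()))
```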

5.
IEEE J Biomed Health Inform; 25(5): 1747-1757, 2021 May.
Article in English | MEDLINE | ID: mdl-33211668

ABSTRACT

Nucleus detection is a fundamental task in histological image analysis and an important tool for many follow-up analyses. It is known that the sample preparation and scanning procedures of histological slides introduce a great amount of variability into the images and pose challenges for automated nucleus detection. Here, we studied the effect of histopathological sample fixation on the accuracy of a deep learning-based nucleus detection model trained with hematoxylin and eosin-stained images. We experimented with training data that include three fixation methods: PAXgene, formalin, and frozen, and studied the detection accuracy of various convolutional neural networks. Our results indicate that the variability introduced during sample preparation affects the generalization of a model and should be considered when building accurate and robust nucleus detection algorithms. Our dataset includes over 67,000 annotated nucleus locations from 16 patients and three different sample fixation types. The dataset provides an excellent basis for building an accurate and robust nucleus detection model, and combined with unsupervised domain adaptation, the workflow allows generalization to images from unseen domains, including different tissues and images from different labs.
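
For context, detection accuracy for nucleus locations is commonly scored by matching predicted centroids to annotated ones within a distance threshold and reporting precision, recall, and F1. The sketch below shows this generic recipe; the greedy matching and the threshold are illustrative, not necessarily the exact protocol of the paper.

```python
# Generic evaluation recipe for point-based nucleus detection.
import numpy as np
from scipy.spatial.distance import cdist

def match_detections(pred_xy, true_xy, max_dist=10.0):
    d = cdist(pred_xy, true_xy)
    d[d > max_dist] = np.inf
    matched_pred, matched_true = set(), set()
    # Greedily accept the closest remaining pair until none are within range.
    for flat in np.argsort(d, axis=None):
        i, j = np.unravel_index(flat, d.shape)
        if np.isinf(d[i, j]):
            break
        if i not in matched_pred and j not in matched_true:
            matched_pred.add(i)
            matched_true.add(j)
    tp = len(matched_pred)
    precision = tp / max(len(pred_xy), 1)
    recall = tp / max(len(true_xy), 1)
    f1 = 2 * precision * recall / max(precision + recall, 1e-12)
    return precision, recall, f1

print(match_detections(np.array([[5.0, 5.0], [50.0, 50.0]]),
                       np.array([[6.0, 4.0], [80.0, 80.0]])))
```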


Subject(s)
Deep Learning , Cell Nucleus , Histological Techniques , Humans , Image Processing, Computer-Assisted , Neural Networks, Computer
6.
IEEE Trans Med Imaging; 39(2): 534-542, 2020 Feb.
Article in English | MEDLINE | ID: mdl-31398111

ABSTRACT

Immunohistochemical (IHC) assays of ER, PR, and Ki-67 are routinely used in breast cancer diagnostics. Determination of the proportion of stained cells (labeling index) should be restricted to malignant epithelial cells, carefully avoiding tumor-infiltrating stroma and inflammatory cells. Here, we developed a deep learning-based digital mask for automated epithelial cell detection, using fluoro-chromogenic cytokeratin-Ki-67 double staining and sequential hematoxylin-IHC staining as training material. A partially pre-trained deep convolutional neural network was fine-tuned using image batches from 152 patient samples of invasive breast tumors. The validity of the trained digital epithelial cell masks was studied with 366 images captured from 98 unseen samples, by comparing the epithelial cell masks to cytokeratin images and by visual evaluation of the brightfield images performed by two pathologists. Good discrimination of epithelial cells was achieved (area under the mean receiver operating characteristic curve, AUC = 0.93), in good concordance with the pathologists' visual assessment (4.01/5 and 4.67/5). The effect of epithelial cell masking on the Ki-67 labeling index was substantial: 52 tumor images initially classified as low proliferation (Ki-67 < 14%) without epithelial cell masking were re-classified as high proliferation (Ki-67 ≥ 14%) after applying the deep learning-based epithelial cell mask. The digital epithelial cell masks were also found applicable to IHC of ER and PR. We conclude that deep learning can be applied to detect carcinoma cells in breast cancer samples stained with conventional brightfield IHC.
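
The practical effect of epithelial cell masking can be illustrated with a toy computation of the Ki-67 labeling index, the fraction of Ki-67-positive nuclei, with and without restricting the count to nuclei inside an epithelial mask. The data structures and values below are hypothetical.

```python
# Toy Ki-67 labeling index with and without an epithelial mask.
import numpy as np

def ki67_labeling_index(nucleus_xy, ki67_positive, epithelial_mask=None):
    """Share of Ki-67-positive nuclei, optionally restricted to masked epithelium."""
    keep = np.ones(len(nucleus_xy), dtype=bool)
    if epithelial_mask is not None:
        ys = nucleus_xy[:, 1].astype(int)
        xs = nucleus_xy[:, 0].astype(int)
        keep = epithelial_mask[ys, xs]
    selected = ki67_positive[keep]
    return float(selected.mean()) if selected.size else 0.0

# Toy data: 6 nuclei, 2 positive, only the first 3 lie on epithelium.
xy = np.array([[1, 1], [2, 2], [3, 3], [10, 10], [11, 11], [12, 12]])
positive = np.array([True, True, False, False, False, False])
mask = np.zeros((20, 20), dtype=bool)
mask[:5, :5] = True
print(ki67_labeling_index(xy, positive))        # ~0.33 without masking
print(ki67_labeling_index(xy, positive, mask))  # ~0.67 restricted to epithelium
```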


Subject(s)
Breast Neoplasms/diagnostic imaging , Deep Learning , Image Interpretation, Computer-Assisted/methods , Immunohistochemistry/methods , Keratins/analysis , Algorithms , Biomarkers, Tumor/analysis , Breast Neoplasms/chemistry , Breast Neoplasms/pathology , Epithelial Cells/chemistry , Female , Humans , Ki-67 Antigen/analysis , Receptors, Estrogen/analysis , Receptors, Progesterone/analysis
7.
JAMA; 318(22): 2199-2210, 2017 Dec 12.
Article in English | MEDLINE | ID: mdl-29234806

ABSTRACT

Importance: Application of deep learning algorithms to whole-slide pathology images can potentially improve diagnostic accuracy and efficiency. Objective: To assess the performance of automated deep learning algorithms at detecting metastases in hematoxylin and eosin-stained tissue sections of lymph nodes of women with breast cancer and to compare it with pathologists' diagnoses in a diagnostic setting. Design, Setting, and Participants: Researcher challenge competition (CAMELYON16) to develop automated solutions for detecting lymph node metastases (November 2015-November 2016). A training data set of whole-slide images from 2 centers in the Netherlands with (n = 110) and without (n = 160) nodal metastases verified by immunohistochemical staining was provided to challenge participants to build algorithms. Algorithm performance was evaluated in an independent test set of 129 whole-slide images (49 with and 80 without metastases). The same test set of corresponding glass slides was also evaluated by a panel of 11 pathologists with time constraint (WTC) from the Netherlands to ascertain the likelihood of nodal metastases for each slide in a flexible 2-hour session, simulating routine pathology workflow, and by 1 pathologist without time constraint (WOTC). Exposures: Deep learning algorithms submitted as part of a challenge competition or pathologist interpretation. Main Outcomes and Measures: The presence of specific metastatic foci and the absence vs presence of lymph node metastasis in a slide or image using receiver operating characteristic curve analysis. The 11 pathologists participating in the simulation exercise rated their diagnostic confidence as definitely normal, probably normal, equivocal, probably tumor, or definitely tumor. Results: The area under the receiver operating characteristic curve (AUC) for the algorithms ranged from 0.556 to 0.994. The top-performing algorithm achieved a lesion-level true-positive fraction comparable with that of the pathologist WOTC (72.4% [95% CI, 64.3%-80.4%]) at a mean of 0.0125 false positives per normal whole-slide image. For the whole-slide image classification task, the best algorithm (AUC, 0.994 [95% CI, 0.983-0.999]) performed significantly better than the pathologists WTC in a diagnostic simulation (mean AUC, 0.810 [range, 0.738-0.884]; P < .001). The top 5 algorithms had a mean AUC that was comparable with the pathologist interpreting the slides in the absence of time constraints (mean AUC, 0.960 [range, 0.923-0.994] for the top 5 algorithms vs 0.966 [95% CI, 0.927-0.998] for the pathologist WOTC). Conclusions and Relevance: In the setting of a challenge competition, some deep learning algorithms achieved better diagnostic performance than a panel of 11 pathologists participating in a simulation exercise designed to mimic routine pathology workflow; algorithm performance was comparable with an expert pathologist interpreting whole-slide images without time constraints. Whether this approach has clinical utility will require evaluation in a clinical setting.
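
The slide-level metric used throughout the study, area under the receiver operating characteristic curve, can be computed from per-slide scores and labels as in the toy example below; the scores and labels are invented for illustration.

```python
# Toy slide-level ROC analysis: algorithm scores vs. ground-truth labels.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1])                      # metastasis present or not
y_score = np.array([0.1, 0.4, 0.8, 0.35, 0.2, 0.9, 0.05, 0.7])   # per-slide algorithm output

auc = roc_auc_score(y_true, y_score)
fpr, tpr, thresholds = roc_curve(y_true, y_score)
print(f"slide-level AUC = {auc:.3f}")
```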


Subject(s)
Breast Neoplasms/pathology , Lymphatic Metastasis/diagnosis , Machine Learning , Pathologists , Algorithms , Female , Humans , Lymphatic Metastasis/pathology , Pathology, Clinical , ROC Curve
8.
Cytometry A; 91(6): 555-565, 2017 Jun.
Article in English | MEDLINE | ID: mdl-28426134

ABSTRACT

Digital pathology has led to a demand for automated detection of regions of interest, such as cancerous tissue, from scanned whole slide images. With accurate methods based on image analysis and machine learning, significant speed-ups and cost savings could be achieved through increased throughput in histological assessment. This article describes a machine learning approach for the detection of cancerous tissue from scanned whole slide images. Our method is based on feature engineering and supervised learning with a random forest model. The features extracted from the whole slide images include several local descriptors related to image texture, spatial structure, and distribution of nuclei. The method was evaluated on breast cancer metastasis detection from lymph node samples. Our results show that the method detects metastatic areas with high accuracy (AUC = 0.97-0.98 for tumor detection within the whole image area, AUC = 0.84-0.91 for tumor vs. normal tissue detection) and that the method generalizes well to images from more than one laboratory. Further, the method outputs an interpretable classification model, enabling the linking of individual features to differences between tissue types. © 2017 International Society for Advancement of Cytometry.
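
A hedged sketch of the general recipe the abstract describes, hand-engineered per-tile features fed to a random forest classifier: the two toy features below stand in for the richer texture, spatial-structure, and nuclei-distribution descriptors used in the paper, and the simulated data are purely illustrative.

```python
# Hand-engineered per-tile features plus a random forest classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def tile_features(tile):
    """Toy per-tile descriptor: grayscale mean and standard deviation."""
    return np.array([tile.mean(), tile.std()])

rng = np.random.default_rng(0)
# Simulated training tiles: "tumor" tiles darker and more heterogeneous.
normal = [rng.normal(0.8, 0.05, (64, 64)) for _ in range(50)]
tumor = [rng.normal(0.5, 0.15, (64, 64)) for _ in range(50)]
X = np.stack([tile_features(t) for t in normal + tumor])
y = np.array([0] * 50 + [1] * 50)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
# Feature importances keep the model interpretable, as the abstract notes.
print(dict(zip(["mean", "std"], model.feature_importances_.round(3))))
```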


Subject(s)
Breast Neoplasms/diagnosis , Histocytochemistry/statistics & numerical data , Image Interpretation, Computer-Assisted/methods , Lymph Nodes/diagnostic imaging , Machine Learning , Adult , Area Under Curve , Breast Neoplasms/pathology , Cell Nucleus/pathology , Cell Nucleus/ultrastructure , Eosine Yellowish-(YS) , Female , Hematoxylin , Humans , Lymph Nodes/pathology , Lymphatic Metastasis , Lymphocytes/pathology , Lymphocytes/ultrastructure , Middle Aged , ROC Curve , Software
9.
Sci Rep; 7: 44831, 2017 Mar 20.
Article in English | MEDLINE | ID: mdl-28317907

ABSTRACT

Cancer involves histological changes in tissue, which are of primary importance in pathological diagnosis and research. Automated histological analysis requires the ability to computationally separate pathological alterations from normal tissue in all its variability. On the other hand, understanding connections between genetic alterations and histological attributes requires the development of enhanced analysis methods that are also suitable for small sample sizes. Here, we set out to develop computational methods for early detection and distinction of prostate cancer-related pathological alterations. We analyze features from H&E-stained histological images of normal mouse prostate epithelium, identifying descriptors of the variability between the ventral, lateral, and dorsal lobes. In addition, we use two common prostate cancer models, Hi-Myc and Pten+/- mice, to build a feature-based machine learning model separating the early pathological lesions provoked by these genetic alterations. This work offers a set of computational methods for the separation of early neoplastic lesions in the prostates of model mice and provides proof of principle for linking specific tumor genotypes to quantitative histological characteristics. The results show that separation between different spatial locations within the organ, as well as classification between histologies linked to different genetic backgrounds, can be performed with very high specificity and sensitivity.
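
Because the abstract emphasizes methods that remain usable with small sample sizes, the sketch below shows one standard way to evaluate a feature-based classifier in that regime, leave-one-out cross-validation on synthetic stand-in features; it is an illustration, not the published analysis.

```python
# Leave-one-out cross-validation for a feature-based classifier on few samples.
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# 12 samples per group, 5 synthetic features each, with a modest class difference.
X = np.vstack([rng.normal(0.0, 1.0, (12, 5)), rng.normal(1.0, 1.0, (12, 5))])
y = np.array([0] * 12 + [1] * 12)  # e.g. wild-type vs. mutant histology (toy labels)

scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy = {scores.mean():.2f}")
```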


Subject(s)
Epithelium/pathology , Precancerous Conditions/pathology , Prostatic Neoplasms/pathology , Animals , Biomarkers , Biopsy , Computational Biology/methods , Disease Models, Animal , Epithelium/metabolism , Gene Expression Profiling , Genetic Heterogeneity , Genotype , Humans , Immunohistochemistry , Male , Mice , Neoplasm Grading , Neoplasm Staging , PTEN Phosphohydrolase/genetics , PTEN Phosphohydrolase/metabolism , Precancerous Conditions/genetics , Precancerous Conditions/metabolism , Prostatic Neoplasms/genetics , Prostatic Neoplasms/metabolism , ROC Curve , Transcriptome
10.
J Pathol Inform; 7: 5, 2016.
Article in English | MEDLINE | ID: mdl-26955503

ABSTRACT

This paper describes work presented at the Nordic Symposium on Digital Pathology 2015 in Linköping, Sweden. Prostatic intraepithelial neoplasia (PIN) represents premalignant tissue involving epithelial growth confined to the lumen of prostatic acini. In attempts to understand oncogenesis in the human prostate, early neoplastic changes can be modeled in the mouse with genetic manipulation of certain tumor suppressor genes or oncogenes. As with many early pathological changes, the PIN lesions in the mouse prostate are macroscopically small but microscopically span areas often larger than a single high-magnification field of view. This makes it challenging to utilize the full potential of the data acquired from histological specimens. We use whole prostates fixed in the molecular fixative PAXgene™, embedded in paraffin, sectioned through, and stained with H&E. To visualize and analyze the microscopic information spanning whole mouse PIN (mPIN) lesions, we utilize automated whole slide scanning and stacked sections through the tissue. The regions of interest are masked, and the masked areas are processed using a cascade of automated image analysis steps. The images are normalized in color space, after which secretion areas are excluded and features are extracted. Machine learning is utilized to build a model of early PIN lesions for determining the probability of histological changes based on the calculated features. We performed a feature-based analysis of mPIN lesions. First, a quantitative representation of over 100 features was built, including several features representing pathological changes in PIN, especially describing the spatial growth pattern of lesions in the prostate tissue. Furthermore, we built a classification model, which is able to separate more advanced from mild lesions, corresponding to grading by visual inspection. The classifier allows both determining the probability of early histological changes for uncategorized tissue samples and interpreting the model parameters. Here, we develop a quantitative image analysis pipeline to describe morphological changes in histological images. Even subtle changes in mPIN lesion characteristics can be described with feature analysis and machine learning. Constructing and using multidimensional feature data to represent histological changes enables richer analysis and interpretation of early pathological lesions.
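
One step of the described cascade, color-space normalization of H&E tiles, can be sketched as simple per-channel mean and standard deviation matching to a reference tile (Reinhard-style normalization is usually done in Lab color space; RGB is used here for brevity, and the exact method of the paper may differ).

```python
# Toy per-channel color normalization of a tile to a reference tile.
import numpy as np

def normalize_to_reference(image, reference):
    """Match per-channel mean/std of `image` (H x W x 3, float in [0, 1]) to `reference`."""
    out = np.empty_like(image, dtype=float)
    for c in range(3):
        src, ref = image[..., c].astype(float), reference[..., c].astype(float)
        scaled = (src - src.mean()) / (src.std() + 1e-8)
        out[..., c] = scaled * ref.std() + ref.mean()
    return np.clip(out, 0.0, 1.0)

rng = np.random.default_rng(2)
tile = rng.uniform(0.3, 0.9, (64, 64, 3))
reference = rng.uniform(0.5, 1.0, (64, 64, 3))
normalized = normalize_to_reference(tile, reference)
print(normalized.mean(axis=(0, 1)).round(2), reference.mean(axis=(0, 1)).round(2))
```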
