Results 1 - 12 of 12
1.
Cell ; 187(10): 2502-2520.e17, 2024 May 09.
Article in English | MEDLINE | ID: mdl-38729110

ABSTRACT

Human tissue, which is inherently three-dimensional (3D), is traditionally examined through standard-of-care histopathology as limited two-dimensional (2D) cross-sections that can insufficiently represent the tissue due to sampling bias. To holistically characterize histomorphology, 3D imaging modalities have been developed, but clinical translation is hampered by complex manual evaluation and lack of computational platforms to distill clinical insights from large, high-resolution datasets. We present TriPath, a deep-learning platform for processing tissue volumes and efficiently predicting clinical outcomes based on 3D morphological features. Recurrence risk-stratification models were trained on prostate cancer specimens imaged with open-top light-sheet microscopy or microcomputed tomography. By comprehensively capturing 3D morphologies, 3D volume-based prognostication achieves superior performance to traditional 2D slice-based approaches, including clinical/histopathological baselines from six certified genitourinary pathologists. Incorporating greater tissue volume improves prognostic performance and mitigates risk prediction variability from sampling bias, further emphasizing the value of capturing larger extents of heterogeneous morphology.
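As a rough illustration of the volume-level, weakly supervised modeling described above (a minimal sketch, not the published TriPath code), the snippet below aggregates a bag of precomputed 3D-patch feature vectors into a single recurrence-risk logit using attention-based multiple-instance learning; the feature dimension, patch count, and upstream patch encoder are all assumptions.

```python
# Minimal sketch (not the published TriPath code) of attention-based multiple-instance
# learning over 3D patch embeddings, assuming patches were already encoded into feature
# vectors by some pretrained 3D encoder.
import torch
import torch.nn as nn

class AttentionMIL3D(nn.Module):
    """Aggregates a bag of 3D-patch features into one risk logit."""
    def __init__(self, feat_dim: int = 512, hidden_dim: int = 256):
        super().__init__()
        self.attention = nn.Sequential(
            nn.Linear(feat_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )
        self.classifier = nn.Linear(feat_dim, 1)

    def forward(self, patch_feats: torch.Tensor) -> torch.Tensor:
        # patch_feats: (num_patches, feat_dim) for one tissue volume
        attn = torch.softmax(self.attention(patch_feats), dim=0)  # (num_patches, 1)
        volume_feat = (attn * patch_feats).sum(dim=0)             # (feat_dim,)
        return self.classifier(volume_feat)                       # recurrence-risk logit

# Example: 800 patches from one biopsy volume, each embedded as a 512-d vector.
model = AttentionMIL3D()
bag = torch.randn(800, 512)
risk_logit = model(bag)
```

A design note: attention pooling keeps this kind of model interpretable in the sense that the highest-attention patches indicate which 3D regions drive the risk prediction.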


Subject(s)
Imaging, Three-Dimensional , Prostatic Neoplasms , Supervised Machine Learning , Humans , Male , Deep Learning , Imaging, Three-Dimensional/methods , Prognosis , Prostatic Neoplasms/pathology , Prostatic Neoplasms/diagnostic imaging , X-Ray Microtomography/methods
2.
J Biomed Opt ; 29(3): 036001, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38434772

ABSTRACT

Significance: In recent years, we and others have developed non-destructive methods to obtain three-dimensional (3D) pathology datasets of clinical biopsies and surgical specimens. For prostate cancer risk stratification (prognostication), standard-of-care Gleason grading is based on examining the morphology of prostate glands in thin 2D sections. This motivates us to perform 3D segmentation of prostate glands in our 3D pathology datasets for the purposes of computational analysis of 3D glandular features that could offer improved prognostic performance. Aim: To facilitate prostate cancer risk assessment, we developed a computationally efficient and accurate deep learning model for 3D gland segmentation based on open-top light-sheet microscopy datasets of human prostate biopsies stained with a fluorescent analog of hematoxylin and eosin (H&E). Approach: For 3D gland segmentation based on our H&E-analog 3D pathology datasets, we previously developed a hybrid deep learning and computer vision-based pipeline, called image translation-assisted segmentation in 3D (ITAS3D), which required a complex two-stage procedure and tedious manual optimization of parameters. To simplify this procedure, we use the 3D gland-segmentation masks previously generated by ITAS3D as training datasets for a direct end-to-end deep learning-based segmentation model, nnU-Net. The inputs to this model are 3D pathology datasets of prostate biopsies rapidly stained with an inexpensive fluorescent analog of H&E and the outputs are 3D semantic segmentation masks of the gland epithelium, gland lumen, and surrounding stromal compartments within the tissue. Results: nnU-Net demonstrates remarkable accuracy in 3D gland segmentations even with limited training data. Moreover, compared with the previous ITAS3D pipeline, nnU-Net operation is simpler and faster, and it can maintain good accuracy even with lower-resolution inputs. Conclusions: Our trained DL-based 3D segmentation model will facilitate future studies to demonstrate the value of computational 3D pathology for guiding critical treatment decisions for patients with prostate cancer.
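For readers who want to set up a similar training run, the sketch below shows one way the H&E-analog volumes and ITAS3D-derived masks could be exported as NIfTI image/label pairs in the imagesTr/labelsTr layout described in nnU-Net's documentation; the .npy inputs, the label encoding, and the voxel spacing are assumptions, and a dataset.json (per the nnU-Net docs) would still be required.

```python
# Minimal sketch (assumptions: volumes stored as .npy arrays; nnU-Net's documented
# imagesTr/labelsTr folder convention with a _0000 channel suffix) for converting
# H&E-analog 3D volumes and ITAS3D-derived masks into NIfTI training pairs.
from pathlib import Path
import numpy as np
import nibabel as nib

def export_case(volume_npy: Path, mask_npy: Path, out_dir: Path, case_id: str,
                voxel_size_um=(1.0, 1.0, 1.0)) -> None:
    """Write one image/label pair in an nnU-Net-style dataset layout."""
    affine = np.diag([*voxel_size_um, 1.0])          # encode voxel spacing in the affine
    volume = np.load(volume_npy).astype(np.float32)  # fluorescence intensities
    mask = np.load(mask_npy).astype(np.uint8)        # hypothetical labels: 0=stroma/background,
                                                     # 1=gland epithelium, 2=gland lumen
    (out_dir / "imagesTr").mkdir(parents=True, exist_ok=True)
    (out_dir / "labelsTr").mkdir(parents=True, exist_ok=True)
    nib.save(nib.Nifti1Image(volume, affine),
             str(out_dir / "imagesTr" / f"{case_id}_0000.nii.gz"))
    nib.save(nib.Nifti1Image(mask, affine),
             str(out_dir / "labelsTr" / f"{case_id}.nii.gz"))
```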


Subject(s)
Prostate , Prostatic Neoplasms , Male , Humans , Prostate/diagnostic imaging , Prostatic Neoplasms/diagnostic imaging , Biopsy , Coloring Agents , Eosine Yellowish-(YS)
3.
Nat Protoc ; 19(4): 1122-1148, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38263522

ABSTRACT

Recent advances in 3D pathology offer the ability to image orders of magnitude more tissue than conventional pathology methods while also providing a volumetric context that is not achievable with 2D tissue sections, and all without requiring destructive tissue sectioning. Generating high-quality 3D pathology datasets on a consistent basis, however, is not trivial and requires careful attention to a series of details during tissue preparation, imaging and initial data processing, as well as iterative optimization of the entire process. Here, we provide an end-to-end procedure covering all aspects of a 3D pathology workflow (using light-sheet microscopy as an illustrative imaging platform) with sufficient detail to perform well-controlled preclinical and clinical studies. Although 3D pathology is compatible with diverse staining protocols and computationally generated color palettes for visual analysis, this protocol focuses on the use of a fluorescent analog of hematoxylin and eosin, which remains the most common stain used for gold-standard pathological reports. We present our guidelines for a broad range of end users (e.g., biologists, clinical researchers and engineers) in a simple format. The end-to-end workflow requires 3-6 d to complete, bearing in mind that data analysis may take longer.


Subject(s)
Imaging, Three-Dimensional , Microscopy , Imaging, Three-Dimensional/methods , Workflow , Microscopy/methods , Coloring Agents , Staining and Labeling
4.
Lab Invest ; 103(12): 100265, 2023 12.
Article in English | MEDLINE | ID: mdl-37858679

ABSTRACT

Prostate cancer prognostication largely relies on visual assessment of a few thinly sectioned biopsy specimens under a microscope to assign a Gleason grade group (GG). Unfortunately, the assigned GG is not always associated with a patient's outcome, in part because of the limited sampling of spatially heterogeneous tumors achieved by 2-dimensional histopathology. In this study, open-top light-sheet microscopy was used to obtain 3-dimensional pathology data sets that were assessed by 4 human readers. Intrabiopsy variability was assessed by asking readers to perform Gleason grading of 5 different levels per biopsy for a total of 20 core needle biopsies (ie, 100 total images). Intrabiopsy variability (Cohen κ) was calculated as the worst pairwise agreement in GG between individual levels within each biopsy and found to be 0.34, 0.34, 0.38, and 0.43 for the 4 pathologists. These preliminary results reveal that even within a 1-mm-diameter needle core, GG based on 2-dimensional images can vary dramatically depending on the location within a biopsy being analyzed. We believe that morphologic assessment of whole biopsies in 3 dimensions has the potential to enable more reliable and consistent tumor grading.
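The worst-pairwise-agreement metric reported above can be made concrete with a short sketch (hypothetical data layout: one reader's grade groups arranged as a biopsies-by-levels array); pairwise Cohen's κ is computed between every pair of levels and the minimum is reported.

```python
# Minimal sketch (hypothetical data layout) of the intrabiopsy-variability metric: for one
# reader, compute Cohen's kappa for every pair of biopsy levels across biopsies and report
# the worst (lowest) pairwise agreement.
from itertools import combinations
import numpy as np
from sklearn.metrics import cohen_kappa_score

def worst_pairwise_kappa(grades: np.ndarray) -> float:
    """grades: array of shape (num_biopsies, num_levels) holding grade groups (1-5)."""
    _, num_levels = grades.shape
    kappas = [cohen_kappa_score(grades[:, i], grades[:, j])
              for i, j in combinations(range(num_levels), 2)]
    return min(kappas)

# Example: 20 biopsies x 5 levels of simulated grade-group calls for one reader.
rng = np.random.default_rng(0)
example = rng.integers(1, 6, size=(20, 5))
print(worst_pairwise_kappa(example))
```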


Subject(s)
Prostate , Prostatic Neoplasms , Male , Humans , Prostate/pathology , Biopsy , Prostatic Neoplasms/pathology , Biopsy, Large-Core Needle , Neoplasm Grading
5.
ArXiv ; 2023 Jul 27.
Article in English | MEDLINE | ID: mdl-37547660

ABSTRACT

Human tissue consists of complex structures that display a diversity of morphologies, forming a tissue microenvironment that is, by nature, three-dimensional (3D). However, the current standard-of-care involves slicing 3D tissue specimens into two-dimensional (2D) sections and selecting a few for microscopic evaluation [1,2], with concomitant risks of sampling bias and misdiagnosis [3-6]. To address this, there have been intense efforts to capture 3D tissue morphology and transition to 3D pathology, with the development of multiple high-resolution 3D imaging modalities [7-18]. However, these tools have had little translation to clinical practice as manual evaluation of such large data by pathologists is impractical and there is a lack of computational platforms that can efficiently process the 3D images and provide patient-level clinical insights. Here we present Modality-Agnostic Multiple instance learning for volumetric Block Analysis (MAMBA), a deep-learning-based platform for processing 3D tissue images from diverse imaging modalities and predicting patient outcomes. Archived prostate cancer specimens were imaged with open-top light-sheet microscopy [12-14] or microcomputed tomography [15,16] and the resulting 3D datasets were used to train risk-stratification networks based on 5-year biochemical recurrence outcomes via MAMBA. With the 3D block-based approach, MAMBA achieves an area under the receiver operating characteristic curve (AUC) of 0.86 and 0.74, superior to traditional 2D single-slice-based prognostication (AUC of 0.79 and 0.57), suggesting superior prognostication with 3D morphological features. Further analyses reveal that the incorporation of greater tissue volume improves prognostic performance and mitigates risk prediction variability from sampling bias, suggesting that there is value in capturing larger extents of spatially heterogeneous 3D morphology. With the rapid growth and adoption of 3D spatial biology and pathology techniques by researchers and clinicians, MAMBA provides a general and efficient framework for 3D weakly supervised learning for clinical decision support and can help to reveal novel 3D morphological biomarkers for prognosis and therapeutic response.
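The block-based preprocessing described in this abstract can be illustrated with a minimal sketch (not the released MAMBA code): a 3D volume is tiled into fixed-size blocks and only blocks containing sufficient tissue are retained as instances for weakly supervised learning; the block size and foreground thresholds below are assumptions.

```python
# Minimal sketch (not the released MAMBA code) of block-based tiling: divide a 3D volume
# into fixed-size blocks and keep only those with enough foreground tissue, producing a
# "bag" of instances for weakly supervised learning.
import numpy as np

def tile_volume(volume: np.ndarray, block: int = 128, tissue_thresh: float = 0.05,
                min_tissue_frac: float = 0.2):
    """Yield (z, y, x) origins of tissue-containing blocks from a normalized 3D volume."""
    zs, ys, xs = (range(0, s - block + 1, block) for s in volume.shape)
    for z in zs:
        for y in ys:
            for x in xs:
                patch = volume[z:z + block, y:y + block, x:x + block]
                # crude foreground test: fraction of voxels above an intensity threshold
                if (patch > tissue_thresh).mean() >= min_tissue_frac:
                    yield (z, y, x)

# Example: a synthetic 256^3 volume in [0, 1]; real volumes would come from OTLS or microCT.
vol = np.random.rand(256, 256, 256).astype(np.float32)
origins = list(tile_volume(vol))
print(f"{len(origins)} tissue blocks")
```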

6.
bioRxiv ; 2023 Aug 06.
Article in English | MEDLINE | ID: mdl-37577615

ABSTRACT

Recent advances in 3D pathology offer the ability to image orders-of-magnitude more tissue than conventional pathology while providing a volumetric context that is lacking with 2D tissue sections, all without requiring destructive tissue sectioning. Generating high-quality 3D pathology datasets on a consistent basis is non-trivial, requiring careful attention to many details regarding tissue preparation, imaging, and data/image processing in an iterative process. Here we provide an end-to-end protocol covering all aspects of a 3D pathology workflow (using light-sheet microscopy as an illustrative imaging platform) with sufficient detail to perform well-controlled preclinical and clinical studies. While 3D pathology is compatible with diverse staining protocols and computationally generated color palettes for visual analysis, this protocol will focus on a fluorescent analog of hematoxylin and eosin (H&E), which remains the most common stain for gold-standard diagnostic determinations. We present our guidelines for a broad range of end-users (e.g., biologists, clinical researchers, and engineers) in a simple tutorial format.

7.
J Pathol ; 260(4): 390-401, 2023 08.
Article in English | MEDLINE | ID: mdl-37232213

ABSTRACT

Prostate cancer treatment decisions rely heavily on subjective visual interpretation [assigning Gleason patterns or International Society of Urological Pathology (ISUP) grade groups] of limited numbers of two-dimensional (2D) histology sections. Under this paradigm, interobserver variance is high, with ISUP grades not correlating well with outcome for individual patients, and this contributes to the over- and undertreatment of patients. Recent studies have demonstrated improved prognostication of prostate cancer outcomes based on computational analyses of glands and nuclei within 2D whole slide images. Our group has also shown that the computational analysis of three-dimensional (3D) glandular features, extracted from 3D pathology datasets of whole intact biopsies, can allow for improved recurrence prediction compared to corresponding 2D features. Here we seek to expand on these prior studies by exploring the prognostic value of 3D shape-based nuclear features in prostate cancer (e.g. nuclear size, sphericity). 3D pathology datasets were generated using open-top light-sheet (OTLS) microscopy of 102 cancer-containing biopsies extracted ex vivo from the prostatectomy specimens of 46 patients. A deep learning-based workflow was developed for 3D nuclear segmentation within the glandular epithelium versus stromal regions of the biopsies. 3D shape-based nuclear features were extracted, and a nested cross-validation scheme was used to train a supervised machine classifier based on 5-year biochemical recurrence (BCR) outcomes. Nuclear features of the glandular epithelium were found to be more prognostic than stromal cell nuclear features (area under the ROC curve [AUC] = 0.72 versus 0.63). 3D shape-based nuclear features of the glandular epithelium were also more strongly associated with the risk of BCR than analogous 2D features (AUC = 0.72 versus 0.62). The results of this preliminary investigation suggest that 3D shape-based nuclear features are associated with prostate cancer aggressiveness and could be of value for the development of decision-support tools. © 2023 The Pathological Society of Great Britain and Ireland.
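To make the 3D shape-based nuclear features concrete, the sketch below (illustrative, not the authors' pipeline) computes per-nucleus volume and sphericity from a 3D instance-labeled mask, using a triangulated isosurface for the surface area; the voxel size is an assumption.

```python
# Minimal sketch (illustrative, not the authors' pipeline) of two 3D shape-based nuclear
# features -- volume and sphericity -- computed per segmented nucleus from a 3D
# integer-labeled instance mask.
import numpy as np
from skimage import measure

def nuclear_shape_features(label_volume: np.ndarray, voxel_um: float = 0.5):
    """Return {nucleus_id: (volume_um3, sphericity)} from a 3D labeled mask."""
    feats = {}
    for region in measure.regionprops(label_volume):
        volume = region.area * voxel_um ** 3                       # voxel count -> um^3
        # surface area from a triangulated isosurface of this nucleus (padded so the
        # surface is closed at the array border)
        mask = np.pad((label_volume == region.label).astype(np.uint8), 1)
        verts, faces, _, _ = measure.marching_cubes(mask, level=0.5,
                                                    spacing=(voxel_um,) * 3)
        surface = measure.mesh_surface_area(verts, faces)
        sphericity = (np.pi ** (1 / 3)) * (6 * volume) ** (2 / 3) / surface
        feats[region.label] = (volume, sphericity)
    return feats
```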


Subject(s)
Prostate , Prostatic Neoplasms , Male , Humans , Neoplasm Grading , Prostate/pathology , Prostatic Neoplasms/pathology , Prognosis , Prostatectomy/methods , Risk Assessment
8.
IEEE Trans Biomed Eng ; 70(7): 2160-2171, 2023 07.
Article in English | MEDLINE | ID: mdl-37021859

ABSTRACT

OBJECTIVE: For tumor resections, margin status typically correlates with patient survival but positive margin rates are generally high (up to 45% for head and neck cancer). Frozen section analysis (FSA) is often used to intraoperatively assess the margins of excised tissue, but suffers from severe under-sampling of the actual margin surface, inferior image quality, slow turnaround, and tissue destructiveness. METHODS: Here, we have developed an imaging workflow to generate en face histologic images of freshly excised surgical margin surfaces based on open-top light-sheet (OTLS) microscopy. Key innovations include (1) the ability to generate false-colored H&E-mimicking images of tissue surfaces stained for < 1 min with a single fluorophore, (2) rapid OTLS surface imaging at a rate of 15 min/cm² followed by real-time post-processing of datasets within RAM at a rate of 5 min/cm², and (3) rapid digital surface extraction to account for topological irregularities at the tissue surface. RESULTS: In addition to the performance metrics listed above, we show that the image quality generated by our rapid surface-histology method approaches that of gold-standard archival histology. CONCLUSION: These results demonstrate the feasibility of using OTLS microscopy to provide intraoperative guidance during surgical oncology procedures. SIGNIFICANCE: The reported methods can potentially improve tumor-resection procedures, thereby improving patient outcomes and quality of life.
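The digital surface extraction step can be sketched as follows (assuming a [z, y, x] volume in which the tissue surface is approached from low z): detect the first above-threshold depth in every (y, x) column and average a thin shell beneath it, yielding an en face image that tolerates an irregular tissue surface.

```python
# Minimal sketch (assumed data layout: volume indexed as [z, y, x], surface entered from
# low z) of digital surface extraction: find the first z at which each (y, x) column
# exceeds a tissue-intensity threshold, then average a thin shell below that depth.
import numpy as np

def extract_surface_layer(volume: np.ndarray, thresh: float,
                          depth_voxels: int = 5) -> np.ndarray:
    """Average a thin layer starting at the detected surface for every (y, x) column."""
    above = volume > thresh                                            # (Z, Y, X) boolean
    # first z index where tissue appears; columns with no tissue fall back to z=0
    surface_z = np.where(above.any(axis=0), above.argmax(axis=0), 0)   # (Y, X)
    z_idx = surface_z[None, :, :] + np.arange(depth_voxels)[:, None, None]
    z_idx = np.clip(z_idx, 0, volume.shape[0] - 1)
    shell = np.take_along_axis(volume, z_idx, axis=0)                  # (depth, Y, X)
    return shell.mean(axis=0)                                          # en face projection
```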


Subject(s)
Margins of Excision , Microscopy , Humans , Quality of Life , Histological Techniques
9.
Nat Methods ; 19(5): 613-619, 2022 05.
Article in English | MEDLINE | ID: mdl-35545715

ABSTRACT

Light-sheet microscopy has emerged as the preferred means for high-throughput volumetric imaging of cleared tissues. However, there is a need for a flexible system that can address imaging applications with varied requirements in terms of resolution, sample size, tissue-clearing protocol, and transparent sample-holder material. Here, we present a 'hybrid' system that combines a unique non-orthogonal dual-objective and conventional (orthogonal) open-top light-sheet (OTLS) architecture for versatile multi-scale volumetric imaging. We demonstrate efficient screening and targeted sub-micrometer imaging of sparse axons within an intact, cleared mouse brain. The same system enables high-throughput automated imaging of multiple specimens, as spotlighted by a quantitative multi-scale analysis of brain metastases. Compared with existing academic and commercial light-sheet microscopy systems, our hybrid OTLS system provides a unique combination of versatility and performance necessary to satisfy the diverse requirements of a growing number of cleared-tissue imaging applications.


Subject(s)
Microscopy , Animals , Mice , Microscopy/methods
10.
Cancer Res ; 82(2): 334-345, 2022 01 15.
Article in English | MEDLINE | ID: mdl-34853071

ABSTRACT

Prostate cancer treatment planning is largely dependent upon examination of core-needle biopsies. The microscopic architecture of the prostate glands forms the basis for prognostic grading by pathologists. Interpretation of these convoluted three-dimensional (3D) glandular structures via visual inspection of a limited number of two-dimensional (2D) histology sections is often unreliable, which contributes to the under- and overtreatment of patients. To improve risk assessment and treatment decisions, we have developed a workflow for nondestructive 3D pathology and computational analysis of whole prostate biopsies labeled with a rapid and inexpensive fluorescent analogue of standard hematoxylin and eosin (H&E) staining. This analysis is based on interpretable glandular features and is facilitated by the development of image translation-assisted segmentation in 3D (ITAS3D). ITAS3D is a generalizable deep learning-based strategy that enables tissue microstructures to be volumetrically segmented in an annotation-free and objective (biomarker-based) manner without requiring immunolabeling. As a preliminary demonstration of the translational value of a computational 3D versus a computational 2D pathology approach, we imaged 300 ex vivo biopsies extracted from 50 archived radical prostatectomy specimens, of which 118 biopsies contained cancer. The 3D glandular features in cancer biopsies were superior to corresponding 2D features for risk stratification of patients with low- to intermediate-risk prostate cancer based on their clinical biochemical recurrence outcomes. The results of this study support the use of computational 3D pathology for guiding the clinical management of prostate cancer. SIGNIFICANCE: An end-to-end pipeline for deep learning-assisted computational 3D histology analysis of whole prostate biopsies shows that nondestructive 3D pathology has the potential to enable superior prognostic stratification of patients with prostate cancer.
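One example of the interpretable 3D glandular features referenced above is a lumen-to-epithelium volume ratio; the sketch below (hypothetical label encoding, not the published feature set) computes it from a per-voxel semantic segmentation of a biopsy.

```python
# Minimal sketch (hypothetical label encoding) of a simple interpretable 3D glandular
# feature: the lumen-to-epithelium volume ratio from a 3D semantic segmentation mask.
import numpy as np

EPITHELIUM, LUMEN = 1, 2   # hypothetical class labels in the segmentation mask

def lumen_epithelium_ratio(seg: np.ndarray) -> float:
    """seg: 3D array of per-voxel class labels for one biopsy."""
    epi_voxels = np.count_nonzero(seg == EPITHELIUM)
    lumen_voxels = np.count_nonzero(seg == LUMEN)
    return lumen_voxels / max(epi_voxels, 1)      # guard against empty epithelium
```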


Subject(s)
Deep Learning , Imaging, Three-Dimensional/methods , Prostate/diagnostic imaging , Prostatic Neoplasms/diagnostic imaging , Prostatic Neoplasms/epidemiology , Aged , Biopsy, Large-Core Needle , Cohort Studies , Humans , Male , Middle Aged , Prostate/pathology , Prostatectomy , Prostatic Neoplasms/pathology , Prostatic Neoplasms/surgery , Risk Assessment , Staining and Labeling
11.
Opt Express ; 29(15): 24349-24362, 2021 Jul 19.
Article in English | MEDLINE | ID: mdl-34614682

ABSTRACT

Fluorescence microscopy benefits from spatially and temporally homogeneous illumination with the illumination area matched to the shape and size of the camera sensor. Fiber-coupled illumination schemes have the added benefit of straightforward and robust alignment and ease of installation compared to free-space coupled illumination. Commercial and open-source fiber-coupled, homogenized illumination schemes have recently become available to the public; however, there have been no published comparisons of speckle reduction schemes to date. We characterize three different multimode fibers in combination with two laser speckle reduction devices and compare spatial and temporal profiles to a commercial unit. This work yields a new design, the EvenField Illuminator, which is freely available for researchers to integrate into their own imaging systems.
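Two of the quantities typically compared in this kind of characterization, illumination flatness across the field and speckle contrast, can be computed with a short sketch; the central-crop fraction, the min/max definition of flatness, and the std/mean definition of speckle contrast below are illustrative assumptions rather than the paper's exact metrics.

```python
# Minimal sketch of two illumination metrics: spatial flatness over the central field of
# view and speckle contrast (standard deviation divided by mean), computed from a stack
# of illumination-profile images of shape (num_frames, height, width).
import numpy as np

def flatness(profile_image: np.ndarray, central_fraction: float = 0.8) -> float:
    """Ratio of min to max intensity over the central region of a single profile image."""
    h, w = profile_image.shape
    dh, dw = int(h * (1 - central_fraction) / 2), int(w * (1 - central_fraction) / 2)
    crop = profile_image[dh:h - dh, dw:w - dw]
    return float(crop.min() / crop.max())

def speckle_contrast(image_stack: np.ndarray) -> float:
    """Per-frame std/mean over pixels, averaged across the stack."""
    contrasts = image_stack.std(axis=(1, 2)) / image_stack.mean(axis=(1, 2))
    return float(contrasts.mean())
```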

12.
PLoS One ; 15(10): e0233198, 2020.
Article in English | MEDLINE | ID: mdl-33001995

ABSTRACT

Slide-free digital pathology techniques, including nondestructive 3D microscopy, are gaining interest as alternatives to traditional slide-based histology. In order to facilitate clinical adoption of these fluorescence-based techniques, software methods have been developed to convert grayscale fluorescence images into color images that mimic the appearance of standard absorptive chromogens such as hematoxylin and eosin (H&E). However, these false-coloring algorithms often require manual and iterative adjustment of parameters, with results that can be inconsistent in the presence of intensity nonuniformities within an image and/or between specimens (intra- and inter-specimen variability). Here, we present an open-source (Python-based) rapid intensity-leveling and digital-staining package that is specifically designed to render two-channel fluorescence images (i.e. a fluorescent analog of H&E) to the traditional H&E color space for 2D and 3D microscopy datasets. However, this method can be easily tailored for other false-coloring needs. Our package offers (1) automated and uniform false coloring in spite of uneven staining within a large thick specimen, (2) consistent color-space representations that are robust to variations in staining and imaging conditions between different specimens, and (3) GPU-accelerated data processing to allow these methods to scale to large datasets. We demonstrate this platform by generating H&E-like images from cleared tissues that are fluorescently imaged in 3D with open-top light-sheet (OTLS) microscopy, and quantitatively characterizing the results in comparison to traditional slide-based H&E histology.
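A common false-coloring model for two-channel fluorescence, of the kind this package builds on, is a Beer-Lambert-style mapping in which each channel attenuates a virtual white-light source according to hematoxylin-like and eosin-like absorption coefficients; the sketch below is illustrative, and its coefficients are placeholders rather than the published values.

```python
# Minimal sketch (illustrative, not the released package) of Beer-Lambert-style false
# coloring: two normalized fluorescence channels are mapped to RGB via virtual
# hematoxylin-like and eosin-like absorption coefficients (placeholder values).
import numpy as np

K_NUCLEAR = np.array([0.86, 1.00, 0.30])   # hematoxylin-like (purple) -- hypothetical
K_CYTO    = np.array([0.05, 1.00, 0.54])   # eosin-like (pink) -- hypothetical

def false_color_hne(nuclear: np.ndarray, cyto: np.ndarray) -> np.ndarray:
    """Map two normalized fluorescence channels (same shape, values ~[0, 1]) to RGB."""
    optical_density = nuclear[..., None] * K_NUCLEAR + cyto[..., None] * K_CYTO
    rgb = np.exp(-optical_density)          # Beer-Lambert attenuation of white light
    return np.clip(rgb, 0.0, 1.0)
```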


Subject(s)
Imaging, Three-Dimensional/methods , Microscopy, Fluorescence/methods , Coloring Agents , Humans , Pathology/methods , Software , Staining and Labeling