Results 1 - 2 of 2
1.
Article in English | MEDLINE | ID: mdl-38526881

ABSTRACT

Accurately diagnosing chronic kidney disease requires pathologists to assess the structure of multiple tissues under different stains, a process that is time-consuming and labor-intensive. Current AI-based methods for automatic structure assessment, such as segmentation, often demand extensive manual annotation and focus on a single stain domain. To address these challenges, we introduce MSMTSeg, a generative self-supervised meta-learning framework for multi-stained multi-tissue segmentation in renal biopsy whole slide images (WSIs). MSMTSeg incorporates multiple stain-transform models for style translation across stain domains, a self-supervision module for obtaining pre-trained models with domain-specific feature representations, and a meta-learning strategy that leverages the generated virtual data and pre-trained models to learn a domain-invariant feature representation across multiple stains, thereby enhancing segmentation performance. Experimental results demonstrate that MSMTSeg achieves superior and robust performance, with an mDSC of 0.836 and an mIoU of 0.718 for multiple tissues under different stains, using only one annotated training sample per stain. Our ablation study confirms the effectiveness of each component, placing MSMTSeg ahead of classic advanced segmentation networks, recent few-shot segmentation methods, and unsupervised domain adaptation methods. In conclusion, the proposed few-shot cross-domain technology offers a feasible and cost-effective solution for multi-stained renal histology segmentation, providing convenient assistance to pathologists in clinical practice. The source code and conditionally accessible data are available at https://github.com/SnowRain510/MSMTSeg.
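The abstract does not detail MSMTSeg's meta-learning strategy, so the sketch below is a generic Reptile-style meta-update over several stain domains, with a toy convolutional network and random tensors standing in for real or style-translated WSI patches. All names, hyperparameters, and data here are illustrative assumptions, not the authors' implementation.

```python
# Minimal Reptile-style meta-learning sketch across multiple stain domains.
# NOTE: TinySegNet, the hyperparameters, and the random batches are hypothetical
# stand-ins; they do not reproduce MSMTSeg's actual architecture or training.
import copy
import torch
from torch import nn

class TinySegNet(nn.Module):
    def __init__(self, n_classes=4):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, n_classes, 1),
        )

    def forward(self, x):
        return self.body(x)

def reptile_meta_step(meta_model, domain_batches, inner_lr=1e-3,
                      meta_lr=0.1, inner_steps=3):
    """One meta-update: adapt to each stain domain separately, then move the
    meta-weights towards the average of the domain-adapted weights."""
    original = copy.deepcopy(meta_model.state_dict())
    meta_state = copy.deepcopy(original)
    for images, masks in domain_batches:
        model = copy.deepcopy(meta_model)              # start from meta-weights
        opt = torch.optim.SGD(model.parameters(), lr=inner_lr)
        for _ in range(inner_steps):                   # inner-loop adaptation
            opt.zero_grad()
            loss = nn.functional.cross_entropy(model(images), masks)
            loss.backward()
            opt.step()
        adapted = model.state_dict()
        for k in meta_state:                           # outer (meta) update
            meta_state[k] += meta_lr * (adapted[k] - original[k]) / len(domain_batches)
    meta_model.load_state_dict(meta_state)

# Toy usage: two "stain domains" of random 64x64 patches with 4 tissue classes.
model = TinySegNet()
batches = [(torch.randn(2, 3, 64, 64), torch.randint(0, 4, (2, 64, 64)))
           for _ in range(2)]
reptile_meta_step(model, batches)
```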

2.
Comput Biol Med; 166: 107470, 2023 Sep 09.
Article in English | MEDLINE | ID: mdl-37722173

ABSTRACT

Diagnosis of diabetic kidney disease (DKD) mainly relies on screening the morphological variations and internal lesions of glomeruli in pathological kidney biopsies. The prominent pathological alterations of glomeruli in DKD include glomerular hypertrophy and nodular mesangial sclerosis. However, qualitative judgment of these alterations is inaccurate and inconsistent owing to the intra- and inter-observer variability of pathologists. It is therefore necessary to design artificial intelligence (AI) methods for accurate quantification of these pathological alterations and for outcome prediction in DKD. In this work, we present an AI-driven framework that quantifies the volume of glomeruli and the degree of nodular mesangial sclerosis, respectively, based on an instance segmentation module and a novel weakly supervised Macro-Micro Aggregation (MMA) module. We then construct classic machine learning models to predict the degree of DKD from three pathological indicators selected via factor analysis. These modules are trained and tested on a total of 281 whole slide images (WSIs) digitized at two hospitals with different scanners. Our AI framework achieved promising results in the external testing dataset, with an mIoU of 0.926 for glomerulus segmentation and an F1 score of 0.899 for glomerulus classification. Meanwhile, the visualized outputs of the MMA module reflect the location of the lesions. Disease prediction achieved an F1 score of 0.917, further demonstrating the effectiveness of our AI-driven quantification of pathological indicators. Additionally, interpretation of the machine learning model with the SHAP method was consistent with the pathological development of DKD. In conclusion, the proposed auxiliary diagnostic technologies are feasible for quantitative analysis of glomerular tissues and pathological alterations in DKD. The quantitative pathological indicators will also make it more convenient to assist doctors in clinical practice.
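As a rough illustration of the final prediction step described above (classic machine learning on a small set of pathological indicators, interpreted with SHAP), the snippet below trains a random forest on three synthetic indicator columns and computes SHAP values. The data, indicator meanings, and model choice are assumptions for illustration only, not the authors' pipeline or reported results.

```python
# Minimal sketch: predict DKD degree from three pathological indicators and
# interpret the model with SHAP. The data are synthetic stand-ins; the study
# used indicators quantified from 281 WSIs, which are not reproduced here.
import numpy as np
import shap
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(281, 3))                 # three indicator columns per WSI
y = (X @ np.array([0.8, 1.2, -0.5]) + rng.normal(scale=0.5, size=281) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("F1 on held-out split:", f1_score(y_te, clf.predict(X_te)))

# SHAP values show how much each indicator pushes an individual prediction
# up or down, mirroring the interpretation step described in the abstract.
explainer = shap.TreeExplainer(clf)
shap_values = explainer.shap_values(X_te)
```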
