Results 1 - 2 of 2
1.
Med Biol Eng Comput ; 61(3): 847-865, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36624356

ABSTRACT

Traumatic brain injury (TBI) engenders traumatic necrosis and penumbra, areas of secondary neural injury that are crucial targets for therapeutic interventions. Manually segmenting areas of ongoing change such as necrosis, edema, hematoma, and inflammation is tedious, error-prone, and biased. Using multi-parametric MR data from a rodent model study, we demonstrate the effectiveness of an end-to-end deep learning global-attention-based UNet (GA-UNet) framework for automatic segmentation and quantification of TBI lesions. Longitudinal MR scans (2 h, 1, 3, 7, 14, 30, and 60 days) were performed on eight Sprague-Dawley rats after controlled cortical injury. TBI lesion and sub-region segmentation was performed using 3D-UNet and GA-UNet. Dice statistics (DSI) and Hausdorff distance were calculated to assess performance. MR scan variation-based (bias, noise, blur, ghosting) data augmentation was performed to develop a robust model. Training/validation median DSI was 0.9368 for U-Net with T2w and MPRAGE inputs, versus 0.9537 for GA-UNet with the same inputs. Testing accuracies were higher for GA-UNet than U-Net, with a DSI of 0.8232 for the T2w-MPRAGE inputs. Longitudinally, necrosis remained constant while oligemia and penumbra decreased; edema appeared around day 3 and increased with time. GA-UNet shows promise for multi-contrast MR image-based segmentation and quantification of TBI in large cohort studies.


Subject(s)
Brain Injuries, Traumatic , Deep Learning , Rats , Animals , Rats, Sprague-Dawley , Magnetic Resonance Imaging , Cohort Studies , Brain Injuries, Traumatic/diagnostic imaging , Image Processing, Computer-Assisted
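The Dice similarity index (DSI) reported in the abstract above measures voxel-wise overlap between a predicted segmentation mask and a reference mask. A minimal sketch of how it is typically computed on binary masks (illustrative only, not code from the paper):

```python
import numpy as np

def dice_score(pred, ref):
    """Dice similarity index between two binary masks: 2|A∩B| / (|A| + |B|)."""
    pred = np.asarray(pred).astype(bool)
    ref = np.asarray(ref).astype(bool)
    intersection = np.logical_and(pred, ref).sum()
    total = pred.sum() + ref.sum()
    if total == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * intersection / total

# Toy example: masks of 4 voxels each, overlapping in 3 voxels
a = np.array([1, 1, 1, 1, 0, 0])
b = np.array([1, 1, 1, 0, 1, 0])
print(dice_score(a, b))  # 2*3 / (4+4) = 0.75
```

A DSI of 1.0 means perfect overlap and 0.0 means none, so the reported testing DSI of 0.8232 indicates substantial but imperfect agreement with the reference lesion masks.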
2.
Stem Cell Res Ther ; 10(1): 38, 2019 01 22.
Article in English | MEDLINE | ID: mdl-30670100

ABSTRACT

Adipogenesis is essential in in vitro experimentation to assess the differentiation capability of stem cells, so its accurate measurement is important. Quantitative analysis of adipogenic levels, however, is challenging and often susceptible to errors from non-specific readings or manual estimation by observers. To this end, we developed a novel adipocyte quantification algorithm, named the Fast Adipogenesis Tracking System (FATS), based on computer vision libraries. The FATS algorithm is versatile and capable of accurately detecting and quantifying the percentage of cells undergoing adipogenic and browning differentiation, even under difficult conditions such as the presence of large cell clumps or high cell densities. The algorithm was tested on various cell lines including 3T3-L1 cells, adipose-derived mesenchymal stem cells (ASCs), and induced pluripotent stem cell (iPSC)-derived cells. The FATS algorithm is particularly useful for adipogenic measurement of embryoid bodies derived from pluripotent stem cells and was capable of accurately distinguishing adipogenic cells from false-positive stains. We then demonstrated the effectiveness of the FATS algorithm for screening nuclear receptor ligands that affect adipogenesis in a high-throughput manner. Together, FATS offers a universal and automated image-based method to quantify adipocyte differentiation of different cell lines in both standard and high-throughput workflows.


Subject(s)
Adipocytes/metabolism , High-Throughput Screening Assays/methods , Adipogenesis , Animals , Humans , Mice
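The core task that FATS automates, counting the fraction of cells that stain positive for an adipogenic marker, can be illustrated with a simple label-and-threshold sketch. This is a hypothetical minimal version, not the FATS pipeline itself; the function name and the `threshold` parameter are assumptions for illustration:

```python
import numpy as np
from scipy import ndimage

def percent_adipogenic(stain, nuclei_mask, threshold=0.5):
    """Estimate the percentage of adipogenic cells in a field of view.

    stain       : 2D float array of lipid-stain intensity (0..1)
    nuclei_mask : 2D binary array marking individual cell nuclei
    threshold   : assumed cutoff on mean per-cell stain intensity

    Labels each connected nucleus region as one cell, then counts
    cells whose mean stain intensity exceeds the threshold.
    """
    labels, n_cells = ndimage.label(nuclei_mask)
    if n_cells == 0:
        return 0.0
    positive = sum(
        1 for i in range(1, n_cells + 1)
        if stain[labels == i].mean() > threshold
    )
    return 100.0 * positive / n_cells

# Toy field: two cells, one strongly stained, one weakly stained
stain = np.array([[0.9, 0.9, 0.0, 0.0],
                  [0.9, 0.9, 0.0, 0.0],
                  [0.0, 0.0, 0.1, 0.1],
                  [0.0, 0.0, 0.1, 0.1]])
nuclei = np.array([[1, 1, 0, 0],
                   [1, 1, 0, 0],
                   [0, 0, 1, 1],
                   [0, 0, 1, 1]])
print(percent_adipogenic(stain, nuclei))  # 50.0
```

A real pipeline such as FATS must additionally handle clumped cells, uneven illumination, and false-positive stains, which this sketch deliberately ignores.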