Results 1 - 2 of 2
1.
J Surg Res; 296: 325-336, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38306938

ABSTRACT

INTRODUCTION: Minimally invasive surgery uses electrosurgical tools that generate smoke. This smoke reduces the visibility of the surgical site and spreads harmful substances that are potentially hazardous to the surgical staff. Automatic image analysis may provide assistance. However, existing studies are restricted to simple clear-versus-smoky image classification.

MATERIALS AND METHODS: We propose a novel approach using surgical image analysis with machine learning, including deep neural networks. We address three tasks: 1) smoke quantification, which estimates the visual level of smoke; 2) smoke evacuation confidence, which estimates the confidence level for evacuating smoke; and 3) smoke evacuation recommendation, which estimates the evacuation decision. We collected three datasets with expert annotations. We trained end-to-end neural networks for the three tasks. We also created indirect predictors: task 1 followed by linear regression to solve task 2, and task 2 followed by binary classification to solve task 3 (see the sketch after this abstract).

RESULTS: We observe reasonable inter-expert variability for task 1 and large variability for tasks 2 and 3. For task 1, the expert error is 17.61 percentage points (pp) and the neural network error is 18.45 pp. For task 2, the best results are obtained from the indirect predictor based on task 1: the expert error is 27.35 pp and the predictor error is 23.60 pp. For task 3, the expert accuracy is 76.78% and the predictor accuracy is 81.30%.

CONCLUSIONS: Smoke quantification, evacuation confidence, and evacuation recommendation can be achieved by automatic surgical image analysis with accuracy similar to or better than that of the experts.
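The indirect-predictor chain described in the methods can be illustrated with a minimal sketch: a task 1 model produces a smoke level, a linear regression maps it to an evacuation confidence (task 2), and a binary classifier on that confidence yields the evacuation recommendation (task 3). All data and variable names below are hypothetical placeholders; the paper's actual networks, features, and thresholds are not reproduced here.

```python
# Minimal sketch of the task 1 -> task 2 -> task 3 indirect predictor chain.
# All values are synthetic stand-ins, not the paper's data or models.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(0)

# Stand-in for the task 1 network output: a smoke level in [0, 100] per frame.
smoke_level = rng.uniform(0, 100, size=(200, 1))

# Hypothetical expert annotations for tasks 2 and 3.
evac_confidence = 0.8 * smoke_level.ravel() + rng.normal(0, 5, 200)
evac_decision = (evac_confidence > 50).astype(int)

# Task 2: linear regression from the task 1 estimate to evacuation confidence.
task2 = LinearRegression().fit(smoke_level, evac_confidence)

# Task 3: binary classification on the task 2 output -> evacuate yes/no.
task3 = LogisticRegression().fit(
    task2.predict(smoke_level).reshape(-1, 1), evac_decision
)

frame_score = np.array([[62.0]])                      # task 1 output, new frame
confidence = task2.predict(frame_score)               # task 2: confidence
decision = task3.predict(confidence.reshape(-1, 1))   # task 3: recommendation
print(confidence, decision)
```

One appeal of this chained design over a monolithic end-to-end model is that each stage remains interpretable against its own expert annotations, which matters given the large inter-expert variability reported for tasks 2 and 3.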


Subjects
Image Processing, Computer-Assisted; Minimally Invasive Surgical Procedures; Smoke; Humans; Machine Learning; Neural Networks, Computer; Nicotiana; Smoke/analysis
2.
J Med Signals Sens; 13(2): 73-83, 2023.
Article in English | MEDLINE | ID: mdl-37448539

ABSTRACT

Background and Objective: The endoscopic diagnosis of pathological changes in the gastroesophageal junction, including esophagitis and Barrett's mucosa, is based on the visual detection of two boundaries: the mucosal color change between the esophagus and stomach, and the top endpoint of the gastric folds. The presence and pattern of mucosal breaks at the gastroesophageal mucosal junction (Z-line) classify esophagitis in patients, and the distance between the two boundaries points to possible columnar-lined epithelium. Since visual detection may suffer from intra- and interobserver variability, our objective was to delineate the boundaries automatically with image processing algorithms, which may enable us to measure the dimensions of changes in future studies.

Methods: To demarcate the Z-line, the artifacts in the endoscopy images are first eliminated. In the second step, an initial contour for the Z-line is estimated using the SUSAN edge detector, the Mahalanobis distance criterion, and a Gabor filter bank. Using region-based active contours, this initial contour converges to the Z-line (a rough sketch of this step follows the abstract). Finally, the gastric folds are segmented by applying morphological operators and the Gabor filter bank to the region inside the Z-line.

Results: To evaluate the results, a database consisting of 50 images and their ground truths was collected. The average Dice coefficient and mean squared error of the Z-line segmentation were 0.93 and 3.3, respectively, and the average boundary distance is 12.3 pixels. In addition, two other criteria that compare the fold segmentation with several ground truths, i.e., Sweet-Spot Coverage and the Jaccard Index for Golden Standard, are 0.90 and 0.84, respectively.

Conclusions: Considering the results, the automatic segmentations of the Z-line and gastric folds match the ground truths with appropriate accuracy.
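The contour-refinement step can be sketched with scikit-image's morphological Chan-Vese, a standard region-based active contour. This is only a loose analog of the described pipeline: the synthetic image, the single Gabor response, and the disk-shaped initial contour are placeholders, and the paper's SUSAN/Mahalanobis/Gabor-bank initialization is not reproduced.

```python
# Rough sketch of region-based active-contour refinement of an initial
# contour, assuming scikit-image. Image and initialization are synthetic.
import numpy as np
from skimage.filters import gabor, gaussian
from skimage.segmentation import morphological_chan_vese, disk_level_set

# Synthetic stand-in for a preprocessed endoscopy frame: a bright region
# on a darker background, plus noise and blur.
rng = np.random.default_rng(0)
img = np.zeros((128, 128))
rr, cc = np.ogrid[:128, :128]
img[(rr - 64) ** 2 + (cc - 64) ** 2 < 40 ** 2] = 1.0
img = gaussian(img + 0.1 * rng.normal(size=img.shape), sigma=2)

# Example texture feature: real part of one Gabor response (one filter
# from what would be a full bank in the described method).
texture, _ = gabor(img, frequency=0.2)
print("example Gabor response range:", texture.min(), texture.max())

# Rough initial contour (the paper derives this from edge/texture cues);
# here simply a disk level set near the image center.
init = disk_level_set(img.shape, center=(64, 64), radius=50)

# Region-based active contour: the initial contour converges toward the
# region boundary (the Z-line in the described application).
seg = morphological_chan_vese(img, 60, init_level_set=init, smoothing=2)
print(seg.sum(), "pixels inside the converged contour")
```

Region-based models like Chan-Vese fit this application because the Z-line separates two regions of differing intensity statistics rather than a single crisp edge, so convergence does not depend on strong gradients along the whole boundary.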

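The Dice coefficient and Jaccard index quoted in the second abstract's results are standard overlap metrics for binary masks. A minimal reference implementation of the textbook definitions (not the paper's evaluation code):

```python
# Standard overlap metrics on binary masks: Dice and Jaccard.
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Dice coefficient: 2|A ^ B| / (|A| + |B|)."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def jaccard(a: np.ndarray, b: np.ndarray) -> float:
    """Jaccard index: |A ^ B| / |A v B|."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union

# Toy example: two overlapping square masks.
m1 = np.zeros((10, 10), bool); m1[2:7, 2:7] = True
m2 = np.zeros((10, 10), bool); m2[3:8, 3:8] = True
print(dice(m1, m2), jaccard(m1, m2))
```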