Results 1 - 2 of 2
1.
Int J Comput Assist Radiol Surg ; 14(9): 1611-1617, 2019 Sep.
Article in English | MEDLINE | ID: mdl-31363983

ABSTRACT

PURPOSE: Manual feedback from senior surgeons observing less experienced trainees is laborious, expensive, time-consuming and prone to subjectivity. With the number of surgical procedures increasing annually, there is an unprecedented need for accurate, objective and automatic evaluation of trainees' surgical skills in order to improve surgical practice.

METHODS: In this paper, we designed a convolutional neural network (CNN) to classify surgical skills by extracting latent patterns in the trainees' motions performed during robotic surgery. The method is validated on the JIGSAWS dataset for two surgical skill evaluation tasks: classification and regression.

RESULTS: Our results show that deep neural networks constitute robust machine learning models that reach new competitive state-of-the-art performance on the JIGSAWS dataset. While leveraging the efficiency of CNNs, we were able to mitigate their black-box effect using the class activation map technique.

CONCLUSIONS: This characteristic allowed our method to automatically pinpoint which parts of the surgery influenced the skill evaluation the most, thus explaining a surgical skill classification and providing surgeons with a novel personalized feedback technique. We believe this type of interpretable machine learning model could integrate within "Operation Room 2.0" and support novice surgeons in improving their skills to eventually become experts.
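
A minimal illustrative sketch (not the authors' implementation) of the kind of model this abstract describes: a 1-D convolutional network over multivariate kinematic time series whose global-average-pooling head makes class activation maps (CAM) possible, so per-time-step contributions to a skill prediction can be inspected. The channel sizes, kernel widths and the 76-dimensional JIGSAWS-style kinematic input are assumptions.

# Sketch only; layer sizes and the 76-channel input are assumptions, not the paper's exact architecture.
import torch
import torch.nn as nn

class SkillCNN(nn.Module):
    def __init__(self, in_channels=76, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=8, padding=4), nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(128, 128, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.gap = nn.AdaptiveAvgPool1d(1)            # global average pooling over time
        self.classifier = nn.Linear(128, n_classes)   # one weight vector per skill class

    def forward(self, x):                  # x: (batch, channels, time)
        feats = self.features(x)           # (batch, 128, time)
        pooled = self.gap(feats).squeeze(-1)
        return self.classifier(pooled), feats

    def class_activation_map(self, feats, class_idx):
        # CAM: weight the temporal feature maps by the classifier weights of the
        # chosen class, giving a per-time-step importance score for that prediction.
        w = self.classifier.weight[class_idx]          # (128,)
        return torch.einsum("c,bct->bt", w, feats)     # (batch, time)

model = SkillCNN()
x = torch.randn(2, 76, 500)                # 2 trials, 76 kinematic variables, 500 frames
logits, feats = model(x)
cam = model.class_activation_map(feats, logits.argmax(1)[0].item())

The CAM output can then be plotted against the trial timeline to highlight which portions of the motion most influenced the assigned skill level, which is the kind of personalized feedback the abstract refers to.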


Subject(s)
Clinical Competence , Feedback , General Surgery/education , General Surgery/instrumentation , Machine Learning , Neural Networks, Computer , Biomechanical Phenomena , Cluster Analysis , Humans , Markov Chains , Models, Statistical , Motion , Regression Analysis , Robotic Surgical Procedures , Surgeons
2.
Artif Intell Med ; 91: 3-11, 2018 09.
Article in English | MEDLINE | ID: mdl-30172445

ABSTRACT

OBJECTIVE: The analysis of surgical motion has received growing interest with the development of devices that allow its automatic capture. In this context, advanced surgical training systems make automated assessment of surgical trainees possible. Automatic and quantitative evaluation of surgical skills is an important step in improving surgical patient care.

MATERIAL AND METHOD: In this paper, we present an approach for the discovery and ranking of discriminative and interpretable patterns of surgical practice from recordings of surgical motions. A pattern is defined as a series of actions or events in the kinematic data that together are distinctive of a specific gesture or skill level. Our approach decomposes continuous kinematic data into a set of overlapping gestures represented as strings (bag of words), for which we compute a comparative numerical statistic (tf-idf) that enables discriminative gesture discovery via relative occurrence frequencies.

RESULTS: We carried out experiments on three surgical motion datasets. The results show that the patterns identified by the proposed method can be used to accurately classify individual gestures, skill levels and surgical interfaces. We also show how the patterns provide detailed feedback on the trainee's skill assessment.

CONCLUSIONS: The proposed approach is a useful addition to existing learning tools for surgery, as it provides feedback on which parts of an exercise were used to classify the attempt as correct or incorrect.
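
A minimal illustrative sketch (under stated assumptions, not the authors' code) of the bag-of-words / tf-idf idea this abstract describes: continuous kinematic signals are discretized into symbol strings, overlapping windows become "words", and tf-idf weights rank which words are most characteristic of one skill level over another. The quantile binning, window size and toy data below are assumptions made for illustration.

# Sketch only; binning scheme, window/step sizes and the toy corpus are assumptions.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

def discretize(signal, n_bins=5):
    # Map a 1-D kinematic signal to letters via quantile binning (SAX-like).
    edges = np.quantile(signal, np.linspace(0, 1, n_bins + 1)[1:-1])
    return "".join(chr(ord("a") + b) for b in np.digitize(signal, edges))

def to_words(symbols, window=8, step=4):
    # Overlapping windows of symbols become the "words" of one trial.
    return " ".join(symbols[i:i + window] for i in range(0, len(symbols) - window, step))

# Toy corpus: one document (string of gesture words) per recorded trial.
rng = np.random.default_rng(0)
trials = [rng.standard_normal(300).cumsum() for _ in range(6)]
labels = ["novice", "novice", "novice", "expert", "expert", "expert"]
docs = [to_words(discretize(t)) for t in trials]

vec = TfidfVectorizer()
tfidf = vec.fit_transform(docs).toarray()
vocab = np.array(vec.get_feature_names_out())

# Rank words by how much more tf-idf weight they carry in expert trials than novice ones.
expert = tfidf[[i for i, l in enumerate(labels) if l == "expert"]].mean(axis=0)
novice = tfidf[[i for i, l in enumerate(labels) if l == "novice"]].mean(axis=0)
print(vocab[np.argsort(expert - novice)[::-1][:5]])   # top 5 "expert-like" patterns

The same ranking, computed per gesture or per exercise segment, is what allows the method to point a trainee to the specific parts of an attempt that drove its classification as correct or incorrect.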


Subject(s)
Gestures , Pattern Recognition, Automated/methods , Surgical Procedures, Operative/education , Algorithms , Biomechanical Phenomena , Clinical Competence , Formative Feedback , Humans , Task Performance and Analysis , Time and Motion Studies