1.
Int J Comput Assist Radiol Surg ; 16(5): 819-827, 2021 May.
Article in English | MEDLINE | ID: mdl-33840037

ABSTRACT

PURPOSE: Accurate needle placement is critical in interventions such as biopsies and regional anesthesia, where incorrect insertion can lead to procedure failure and complications. Ultrasound guidance is therefore widely used to improve placement accuracy; however, at steep and deep insertions the needle becomes invisible. Computational methods for automatic needle tip localization could improve the clinical success rate in these scenarios.

METHODS: We propose a novel algorithm for needle tip localization during challenging ultrasound-guided insertions in which the shaft may be invisible and the tip has low intensity. There are two key steps in our approach. First, we enhance the needle tip features in consecutive ultrasound frames using a detection scheme that recognizes subtle intensity variations caused by tip movement. We then employ a hybrid deep neural network comprising a convolutional neural network and long short-term memory (LSTM) recurrent units. The network input is a sequence of fused enhanced frames together with the corresponding original B-mode frames, and this spatiotemporal information is used to predict the needle tip location.

RESULTS: We evaluate our approach on an ex vivo dataset collected during in-plane and out-of-plane insertion of 17G and 22G needles into bovine, porcine, and chicken tissue, acquired with two different ultrasound systems. The model is trained on 5000 frames from 42 video sequences. Evaluation on 600 frames from 30 sequences yields a tip localization error of [Formula: see text] mm and an overall inference time of 0.064 s (~15 fps). Comparison against prior art on challenging datasets reveals a 30% improvement in tip localization accuracy.

CONCLUSION: The proposed method automatically models the temporal dynamics of needle tip motion and is more accurate than state-of-the-art methods. It therefore has the potential to improve needle tip localization in challenging ultrasound-guided interventions.
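The first step described above, enhancing subtle inter-frame intensity changes caused by tip movement, could be sketched as a simple background-subtraction scheme. This is a minimal NumPy illustration, not the authors' implementation; the function name, threshold, and running-average update rule are illustrative assumptions:

```python
import numpy as np

def enhance_tip_motion(frames, alpha=0.9, thresh=10.0):
    """Enhance subtle intensity changes between consecutive B-mode frames.

    frames: (T, H, W) array of consecutive ultrasound frames.
    Returns a (T-1, H, W) array in which moving structures (e.g. the
    needle tip) are emphasized and static background is suppressed.
    """
    frames = frames.astype(np.float32)
    background = frames[0]                        # running background estimate
    enhanced = []
    for f in frames[1:]:
        diff = np.abs(f - background)             # foreground = intensity change
        mask = (diff > thresh).astype(np.float32)  # keep only salient changes
        enhanced.append(diff * mask)
        background = alpha * background + (1 - alpha) * f
    return np.stack(enhanced)
```

In the paper's pipeline, frames enhanced in this spirit are fused with the original B-mode frames before being passed to the CNN-LSTM network.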


Subject(s)
Motion; Neural Networks, Computer; Surgery, Computer-Assisted/methods; Ultrasonography, Interventional/methods; Ultrasonography/methods; Algorithms; Animals; Artifacts; Biopsy; Cattle; Chickens; Needles; Reproducibility of Results; Swine
2.
Int J Comput Assist Radiol Surg ; 14(6): 1017-1026, 2019 Jun.
Article in English | MEDLINE | ID: mdl-30911878

ABSTRACT

PURPOSE: This paper addresses localization of needles inserted both in-plane and out-of-plane in challenging ultrasound-guided interventions where the shaft and tip have low intensity. Our approach combines a novel digital subtraction scheme, which enhances the low-level intensity changes caused by tip movement in the ultrasound image, with a state-of-the-art deep learning scheme for tip detection.

METHODS: As the needle tip moves through tissue, it causes subtle spatiotemporal variations in intensity. Relying on these intensity changes, we formulate a foreground detection scheme that enhances the tip across consecutive ultrasound frames. The tip is further augmented by solving a spatial total variation regularization problem using the split Bregman method. Lastly, we filter out irrelevant motion events with an end-to-end, data-driven deep learning method that models the appearance of the needle tip in ultrasound images, yielding the final tip detection.

RESULTS: The detection model is trained and evaluated on an extensive ex vivo dataset collected with 17G and 22G needles inserted in-plane and out-of-plane into bovine, porcine, and chicken phantoms. We use 5000 images extracted from 20 video sequences for training and 1000 images from 10 sequences for validation. The overall framework is evaluated on 700 images from 20 sequences not used in training or validation, achieving a tip localization error of 0.72 ± 0.04 mm and an overall processing time of 0.094 s per frame (~10 frames per second).

CONCLUSION: The proposed method is faster and more accurate than the state of the art and is resilient to spatiotemporal redundancies. These promising results demonstrate its potential for accurate needle localization in challenging ultrasound-guided interventions.
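The total variation regularization step can be illustrated with a toy solver. The sketch below minimizes the TV-regularized objective 0.5·||u − f||² + λ·TV(u) by plain gradient descent, a simple stand-in for the split Bregman solver used in the paper; the parameter values and wrap-around boundary handling are simplifying assumptions:

```python
import numpy as np

def tv_denoise(f, lam=0.2, step=0.1, iters=100, eps=1e-6):
    """Smooth an enhanced frame with total variation regularization.

    Minimizes 0.5*||u - f||^2 + lam * TV(u) by gradient descent.
    TV smoothing suppresses speckle-like noise while preserving the
    sharp edges of a bright structure such as the needle tip.
    """
    u = f.astype(np.float64).copy()
    for _ in range(iters):
        # forward differences (image gradient), replicated last row/column
        ux = np.diff(u, axis=1, append=u[:, -1:])
        uy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(ux**2 + uy**2 + eps)      # smoothed gradient magnitude
        px, py = ux / mag, uy / mag             # unit gradient field
        # divergence of (px, py) via backward differences
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        u -= step * ((u - f) - lam * div)       # fidelity + TV gradient step
    return u
```

A dedicated split Bregman implementation converges much faster for this objective; gradient descent is used here only to keep the sketch short and dependency-free.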


Subject(s)
Biopsy/methods; Needles; Ultrasonography, Interventional/methods; Animals; Cattle; Chickens; Motion; Phantoms, Imaging; Swine
3.
Int J Comput Assist Radiol Surg ; 13(5): 647-657, 2018 May.
Article in English | MEDLINE | ID: mdl-29512006

ABSTRACT

PURPOSE: We propose a framework for automatic and accurate detection of steeply inserted needles in 2D ultrasound data using convolutional neural networks, and demonstrate its application to needle trajectory estimation and tip localization.

METHODS: Our approach consists of a unified network comprising a fully convolutional network (FCN) and a fast region-based convolutional neural network (R-CNN). The FCN proposes candidate regions, which are then fed to the fast R-CNN for finer needle detection. We leverage a transfer learning paradigm in which the network weights are initialized by training on non-medical images and fine-tuned on ex vivo ultrasound scans collected during insertion of a 17G epidural needle into freshly excised porcine and bovine tissue, at depth settings up to 9 cm and [Formula: see text]-[Formula: see text] insertion angles. The detection results are used to estimate the needle trajectory from intensity-invariant needle features and to localize the tip via an intensity search along the estimated trajectory.

RESULTS: The needle detection model was trained and validated on 2500 ex vivo ultrasound scans. The detection system runs at 25 fps on a GPU and achieves 99.6% precision, 99.78% recall, and an [Formula: see text] score of 0.99. Needle localization was validated on 400 scans collected with a different imaging platform over a bovine/porcine lumbosacral spine phantom, achieving a shaft localization error of [Formula: see text], a tip localization error of [Formula: see text] mm, and a total processing time of 0.58 s.

CONCLUSION: The proposed method is fully automatic and provides robust needle localization in challenging scanning conditions. The accurate, robust results, coupled with real-time detection and sub-second total processing, make it promising for needle detection and localization during challenging minimally invasive ultrasound-guided procedures.
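The final localization stage, fitting a trajectory to detected needle points and then searching intensity along it for the tip, could be sketched as follows. This is an illustrative NumPy toy, not the authors' code; the brightness threshold and the "deepest bright sample" tip criterion are assumptions:

```python
import numpy as np

def localize_tip(detections, image):
    """Estimate the needle trajectory and tip location.

    detections: (N, 2) array of (row, col) points on the needle shaft.
    Fits a line row = slope*col + intercept by least squares, samples
    the image intensity along that line, and takes the furthest
    high-intensity sample as the tip.
    """
    rows, cols = detections[:, 0], detections[:, 1]
    slope, intercept = np.polyfit(cols, rows, 1)        # trajectory fit
    cs = np.arange(image.shape[1])
    rs = np.clip(np.round(slope * cs + intercept).astype(int),
                 0, image.shape[0] - 1)
    profile = image[rs, cs]                             # intensity along line
    bright = np.where(profile > 0.5 * profile.max())[0]
    tip_col = bright.max()                              # furthest bright sample
    return (slope, intercept), (rs[tip_col], tip_col)
```

In the paper, the trajectory is derived from intensity-invariant needle features inside the detected regions rather than from a plain least-squares fit, but the overall line-then-search structure is the same.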


Subject(s)
Anesthesia, Epidural/methods; Needles; Neural Networks, Computer; Phantoms, Imaging; Ultrasonography/methods; Animals; Cattle; Spine; Swine
4.
Int J Comput Assist Radiol Surg ; 13(3): 363-374, 2018 Mar.
Article in English | MEDLINE | ID: mdl-29294213

ABSTRACT

PURPOSE: We propose a novel framework for enhancement and localization of steeply inserted hand-held needles under in-plane 2D ultrasound guidance.

METHODS: Depth-dependent attenuation and non-axial specular reflection hinder the visibility of steeply inserted needles. We model signal transmission maps representing the attenuation probability within the image domain and employ them in a contextual regularization framework to recover needle shaft and tip information. The needle tip is then automatically localized by line fitting along the local-phase-directed trajectory, followed by statistical optimization.

RESULTS: The proposed method was tested on 300 ex vivo ultrasound scans collected during insertion of an epidural needle into freshly excised porcine and bovine tissue. A tip localization accuracy of [Formula: see text] was achieved.

CONCLUSION: The proposed method could be useful in challenging procedures where the needle shaft and tip are inconspicuous. Improved localization results compared with previously proposed methods suggest that the method is promising for further clinical evaluation.
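The idea of a depth-dependent signal transmission map can be illustrated with a crude Beer-Lambert-style model. The sketch below is a simplifying assumption, not the paper's transmission model (which is estimated within a contextual regularization framework); the attenuation coefficient and clamping floor are illustrative:

```python
import numpy as np

def transmission_map(image, beta=0.05):
    """Toy depth-dependent transmission map.

    Models the probability that the transmitted signal reaches each
    pixel as an exponential decay with depth (image row), so deeper
    rows get lower transmission values.
    """
    depth = np.arange(image.shape[0], dtype=np.float64)
    t = np.exp(-beta * depth)                  # exponential decay with depth
    return np.tile(t[:, None], (1, image.shape[1]))

def attenuation_compensate(image, beta=0.05, t_min=1e-3):
    """Boost deep, attenuated structures by dividing by the map."""
    t = np.maximum(transmission_map(image, beta), t_min)  # avoid divide-by-0
    return image.astype(np.float64) / t
```

Dividing by such a map amplifies deep, weakly echogenic structures (like an inconspicuous needle shaft) relative to shallow ones, which is the intuition behind using transmission maps to recover shaft and tip information.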


Subject(s)
Anesthesia, Epidural/instrumentation; Image-Guided Biopsy/instrumentation; Needles; Phantoms, Imaging; Ultrasonography, Interventional/instrumentation; Animals; Cattle; Models, Animal; Swine