1.
Sensors (Basel) ; 23(4)2023 Feb 15.
Article in English | MEDLINE | ID: mdl-36850766

ABSTRACT

Medical ultrasound (US) is a commonly used modality for image-guided procedures. Recent research systems providing in situ visualization of 2D US images via an augmented reality (AR) head-mounted display (HMD) were shown to be advantageous over conventional imaging through reduced task completion times and improved accuracy. In this work, we continue in the direction of recent developments by describing the first AR HMD application visualizing real-time volumetric (3D) US in situ for guiding vascular punctures. We evaluated the application on a technical level as well as in a mixed-methods user study with a qualitative prestudy and a quantitative main study, simulating a vascular puncture. Participants completed the puncture task significantly faster when using the 3D US AR mode compared to 2D US AR, with a 28.4% decrease in time. However, no significant differences were observed in the success rate of vascular puncture (2D US AR: 50% vs. 3D US AR: 72%). On the technical side, the system offers a low latency of 49.90 ± 12.92 ms and a satisfactory frame rate of 60 Hz. Our work shows the feasibility of a system that visualizes real-time 3D US data via an AR HMD, and our experiments further show that this may offer additional benefits in US-guided tasks (i.e., reduced task completion time) over 2D US images viewed in AR by providing a vivid volumetric visualization.


Subject(s)
Augmented Reality , Smart Glasses , Humans , Punctures , Ultrasonography
2.
Int J Comput Assist Radiol Surg ; 17(11): 2081-2091, 2022 Nov.
Article in English | MEDLINE | ID: mdl-35776399

ABSTRACT

PURPOSE: Augmented Reality (AR) has the potential to simplify ultrasound (US) examinations, which usually require a skilled and experienced sonographer to mentally align narrow 2D cross-sectional US images with the 3D anatomy of the patient. This work describes and evaluates a novel approach for tracking retroreflective spheres attached to the US probe using an inside-out technique with the AR glasses HoloLens 2. Finally, live US images are displayed in situ on the imaged anatomy. METHODS: The Unity application UltrARsound performs spatial tracking of the US probe and attached retroreflective markers using the depth camera integrated into the AR glasses, thus eliminating the need for an external tracking system. Additionally, a Kalman filter is implemented to improve the noisy measurements of the camera. US images are streamed wirelessly via the PLUS toolkit to HoloLens 2. The technical evaluation comprises static and dynamic tracking accuracy as well as the frequency and latency of displayed images. RESULTS: Tracking is performed with a median accuracy of 1.98 mm/1.81° in the static setting when using the Kalman filter. In a dynamic scenario, the median error was 2.81 mm/1.70°. The tracking frequency is currently limited to 20 Hz. 83% of the displayed US images had a latency lower than 16 ms. CONCLUSIONS: In this work, we showed that spatial tracking of retroreflective spheres with the depth camera of HoloLens 2 is feasible, achieving a promising accuracy for in situ visualization of live US images. Tracking requires no additional hardware or modifications to HoloLens 2, making it a cheap and easy-to-use approach. Moreover, the minimal latency of displayed images enables real-time perception for the sonographer.
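The abstract does not specify the filter design used in UltrARsound; as a minimal sketch of the idea, the following scalar Kalman filter smooths one coordinate of a noisy marker position (function name, noise variances, and the simulated 2 mm camera noise are all illustrative assumptions, not values from the paper):

```python
import numpy as np

def kalman_smooth(measurements, process_var=1e-4, meas_var=4.0):
    """Scalar constant-position Kalman filter.

    measurements: noisy 1D position samples (e.g. one coordinate of a
    tracked marker, in mm). Noise variances are hypothetical.
    """
    x, p = measurements[0], 1.0    # initial state estimate and covariance
    out = []
    for z in measurements:
        p += process_var           # predict: marker assumed nearly static
        k = p / (p + meas_var)     # Kalman gain
        x += k * (z - x)           # update with measurement residual
        p *= (1.0 - k)
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(0)
truth = 50.0                               # marker 50 mm from the camera
noisy = truth + rng.normal(0, 2.0, 200)    # ~2 mm simulated depth noise
smooth = kalman_smooth(noisy)
print(abs(noisy[-50:] - truth).mean(), abs(smooth[-50:] - truth).mean())
```

A real implementation would filter the full 6-DoF probe pose and tune the process and measurement variances to the depth camera's actual noise characteristics.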


Subject(s)
Cross-Sectional Studies , Humans , Ultrasonography
3.
Sensors (Basel) ; 22(13)2022 Jun 29.
Article in English | MEDLINE | ID: mdl-35808407

ABSTRACT

This work analyzes the use of Microsoft HoloLens 2 in orthopedic oncological surgeries and compares it to its predecessor (Microsoft HoloLens 1). Specifically, we developed two equivalent applications, one for each device, and evaluated the augmented reality (AR) projection accuracy in an experimental scenario using phantoms based on two patients. We achieved automatic registration between the virtual and real worlds using patient-specific surgical guides on each phantom. These contained a small adaptor for a 3D-printed AR marker, the characteristic patterns of which were easily recognized by both Microsoft HoloLens devices. The newer model improved the AR projection accuracy by almost 25%, and both yielded an RMSE below 3 mm. After ascertaining the improvement of the second model in this respect, we went a step further with Microsoft HoloLens 2 and tested it during the surgical intervention of one of the patients. During this experience, we collected the surgeons' feedback on comfort, usability, and ergonomics. Our goal was to estimate whether the improved technical features of the newer model facilitate its implementation in actual surgical scenarios. All of the results indicate that Microsoft HoloLens 2 is better in all aspects affecting surgical interventions and support its use in future experiences.
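The abstract reports projection accuracy as an RMSE over landmark measurements. As a small illustration of how such a figure is computed from paired points (the coordinates below are invented for the example, not the study's data):

```python
import numpy as np

# Hypothetical landmark positions: where the AR overlay projected each
# point vs. where it actually lies on the phantom (in mm).
projected = np.array([[10.2, 0.5, 3.1], [25.1, 4.8, 2.0], [40.3, 9.7, 1.2]])
actual    = np.array([[10.0, 0.0, 3.0], [25.0, 5.0, 2.5], [40.0, 10.0, 1.0]])

errors = np.linalg.norm(projected - actual, axis=1)   # per-point distance
rmse = np.sqrt(np.mean(errors ** 2))                  # root-mean-square error
print(f"RMSE = {rmse:.2f} mm")
```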


Subject(s)
Augmented Reality , Orthopedic Procedures , Surgery, Computer-Assisted , Ergonomics , Humans , Phantoms, Imaging , Software , Surgery, Computer-Assisted/methods
4.
Curr Robot Rep ; 2(1): 55-71, 2021.
Article in English | MEDLINE | ID: mdl-34977593

ABSTRACT

PURPOSE OF REVIEW: This review provides an overview of the most recent robotic ultrasound systems that have emerged over the past five years, highlighting their status and future directions. The systems are categorized based on their level of robot autonomy (LORA). RECENT FINDINGS: Teleoperated systems show the highest level of technical maturity. Collaborative assisting and autonomous systems are still in the research phase, with a focus on ultrasound image processing and force adaptation strategies. However, the missing key factors are clinical studies and appropriate safety strategies. Future research will likely focus on artificial intelligence and virtual/augmented reality to improve image understanding and ergonomics. SUMMARY: A review of robotic ultrasound systems is presented in which technical specifications are first outlined. Thereafter, the literature of the past five years is subdivided into teleoperated, collaboratively assisting, or autonomous systems based on LORA. Finally, future trends for robotic ultrasound systems are reviewed with a focus on artificial intelligence and virtual/augmented reality.

5.
Int J Comput Assist Radiol Surg ; 15(6): 1033-1042, 2020 Jun.
Article in English | MEDLINE | ID: mdl-32383105

ABSTRACT

PURPOSE: During endovascular aneurysm repair (EVAR) procedures, medical instruments are guided with two-dimensional (2D) fluoroscopy and conventional digital subtraction angiography. However, this entails X-ray exposure and contrast agent administration, and depth information is missing. To overcome these drawbacks, a three-dimensional (3D) guidance approach based on tracking systems is introduced and evaluated. METHODS: A multicore fiber with fiber Bragg gratings for shape sensing and three electromagnetic (EM) sensors for locating the shape were integrated into a stentgraft system. A model for obtaining the located shape of the first 38 cm of the stentgraft system with two EM sensors is introduced and compared with a method based on three EM sensors. Both methods were evaluated with a vessel phantom containing a 3D-printed silicone vessel and agar-agar simulating the surrounding tissue. RESULTS: The evaluation of the guidance methods yielded average errors from 1.35 to 2.43 mm and maximum errors from 3.04 to 6.30 mm using three EM sensors, and average errors from 1.57 to 2.64 mm and maximum errors from 2.79 to 6.27 mm using two EM sensors. Moreover, videos made from the continuous measurements showed that real-time guidance is possible with both approaches. CONCLUSION: The results showed that accurate real-time guidance with two or three EM sensors is possible and that two EM sensors are already sufficient. Thus, the introduced 3D guidance method is a promising navigation tool for EVAR procedures. Future work will focus on developing a method with fewer EM sensors and on a detailed latency evaluation of the guidance method.


Subject(s)
Aortic Aneurysm/surgery , Blood Vessel Prosthesis Implantation/instrumentation , Imaging, Three-Dimensional/methods , Angiography, Digital Subtraction , Endovascular Procedures/methods , Fluoroscopy , Humans , Phantoms, Imaging
6.
Innov Surg Sci ; 3(3): 167-177, 2018 Sep.
Article in English | MEDLINE | ID: mdl-31579781

ABSTRACT

INTRODUCTION: Endovascular aortic repair (EVAR) is a minimally invasive technique that prevents life-threatening rupture in patients with aortic pathologies by implantation of an endoluminal stent graft. During the endovascular procedure, device navigation is currently performed by fluoroscopy in combination with digital subtraction angiography. This study presents the current iterative process of biomedical engineering within the disruptive interdisciplinary project Nav EVAR, which includes advanced navigation, imaging techniques and augmented reality with the aim of reducing side effects (namely radiation exposure and contrast agent administration) and optimising visualisation during EVAR procedures. This article describes the current prototype developed in this project and the experiments conducted to evaluate it. METHODS: The current approach of the Nav EVAR project is to guide EVAR interventions in real time with an electromagnetic tracking system after attaching a sensor to the catheter tip and displaying this information on Microsoft HoloLens glasses. This augmented reality technology enables the visualisation of virtual objects superimposed on the real environment. These virtual objects include three-dimensional (3D) objects (namely 3D models of the skin and vascular structures) and two-dimensional (2D) objects [namely orthogonal views of computed tomography (CT) angiograms, 2D images of 3D vascular models, and 2D images of a new virtual angioscopy whose vessel-wall appearance follows that seen in ex vivo and in vivo angioscopies]. Specific external markers were designed to be used as landmarks in the registration process to map the tracking data and radiological data into a common space. In addition, the use of real-time 3D ultrasound (US) is also under evaluation in the Nav EVAR project for guiding endovascular tools and updating navigation with intraoperative imaging.
US volumes are streamed from the US system to HoloLens and visualised at a certain distance from the probe by tracking augmented reality markers. A human model torso including a 3D-printed patient-specific aortic model was built to provide a realistic test environment for evaluating the technical components of the Nav EVAR project. The solutions presented in this study were tested using a US training model and the aortic-aneurysm phantom. RESULTS: During navigation of the catheter tip in the US training model, the 3D models of the phantom surface and vessels were visualised on HoloLens. In addition, a virtual angioscopy was built from a CT scan of the aortic-aneurysm phantom. The external markers designed for this study were visible in the CT scan, and the electromagnetically tracked pointer fitted in each marker hole. US volumes of the US training model were sent from the US system to HoloLens for display, showing a latency of 259±86 ms (mean±standard deviation). CONCLUSION: The Nav EVAR project tackles the problem of radiation exposure and contrast agent administration during EVAR interventions by using a multidisciplinary approach to guide the endovascular tools. Its current state presents several limitations, such as the rigid alignment between preoperative data and the simulated patient. Nevertheless, the techniques shown in this study, in combination with fibre Bragg gratings and optical coherence tomography, are a promising approach to overcoming the problems of EVAR interventions.
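The external markers above serve as fiducials for a point-based rigid registration mapping tracker coordinates into CT space. The abstract does not state which algorithm the project uses; a standard least-squares solution (Kabsch/Arun) can be sketched as follows, with marker coordinates and the simulated transform invented purely for illustration:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (Kabsch) mapping src onto dst.

    src: Nx3 marker positions in tracker coordinates.
    dst: the same markers in CT coordinates.
    Returns R, t such that dst ~= src @ R.T + t.
    """
    sc, dc = src.mean(0), dst.mean(0)
    H = (src - sc).T @ (dst - dc)            # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dc - R @ sc
    return R, t

# Hypothetical external-marker positions in the two coordinate systems:
tracker = np.array([[0, 0, 0], [100, 0, 0], [0, 80, 0], [0, 0, 60.0]])
theta = np.deg2rad(30)                       # simulated 30° rotation + shift
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
ct = tracker @ R_true.T + np.array([5.0, -3.0, 12.0])

R, t = rigid_register(tracker, ct)
fre = np.linalg.norm(tracker @ R.T + t - ct, axis=1).max()
print(f"max fiducial registration error: {fre:.2e} mm")
```

With noise-free synthetic markers the recovered transform matches the simulated one to numerical precision; with real measurements the residual is the fiducial registration error reported in such studies.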
