Results 1 - 7 of 7
1.
Int J Comput Assist Radiol Surg; 13(9): 1335-1344, 2018 Sep.
Article in English | MEDLINE | ID: mdl-29943226

ABSTRACT

PURPOSE: The discrepancy between continuously decreasing opportunities for clinical training and assessment and the increasing complexity of surgical interventions has led to the development of alternative training and assessment options such as anatomical models, computer-based simulators, and cadaver training. However, trainees still face a steep learning curve when they move from training and assessment to actual patient treatment. METHODS: To address this problem for C-arm-based surgery, we introduce a realistic, radiation-free simulation system that combines patient-based 3D-printed anatomy with simulated X-ray imaging using a physical C-arm. To explore the fidelity and usefulness of the proposed mixed-reality system for training and assessment, we conducted a user study in which six surgical experts performed a facet joint injection on the simulator. RESULTS: In a technical evaluation, we show that our system simulates X-ray images accurately, with an RMSE of 1.85 mm compared to real X-ray imaging. The participants expressed agreement with the overall realism of the simulation and the usefulness of the system for assessment, and strong agreement with the usefulness of such a mixed-reality system for training novices and experts. In a quantitative analysis, we furthermore evaluated the suitability of the system for the assessment of surgical skills and gathered preliminary evidence for validity. CONCLUSION: The proposed mixed-reality simulation system facilitates the transition to C-arm-based surgery and has the potential to complement or even replace large parts of cadaver training, to provide a safe assessment environment, and to reduce the risk of errors when proceeding to patient treatment. We propose an assessment concept and outline the steps necessary to expand the system into a test instrument that provides reliable and justified assessment scores indicative of surgical proficiency, with sufficient evidence for validity.
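As an illustration of the kind of fidelity figure reported above, the following sketch computes an RMSE (in mm) between corresponding landmarks localized in real and simulated X-ray images; the landmark coordinates, pixel spacing, and function name are hypothetical and not taken from the paper.

```python
# Minimal sketch: RMSE between corresponding landmarks in real and simulated X-rays.
# All numbers below are made-up placeholders, not data from the study.
import numpy as np

def landmark_rmse_mm(real_px: np.ndarray, simulated_px: np.ndarray,
                     pixel_spacing_mm: float) -> float:
    """RMSE (mm) between corresponding 2D landmarks in real and simulated X-ray images."""
    diff_mm = (real_px - simulated_px) * pixel_spacing_mm      # per-landmark error vectors
    return float(np.sqrt(np.mean(np.sum(diff_mm ** 2, axis=1))))

# Example with hypothetical landmark detections (N x 2 pixel coordinates).
real = np.array([[120.0, 340.0], [410.5, 98.2], [255.3, 260.1]])
sim  = np.array([[121.1, 338.7], [409.9, 100.0], [256.0, 258.8]])
print(landmark_rmse_mm(real, sim, pixel_spacing_mm=0.8))
```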


Subject(s)
Lumbar Vertebrae/surgery; Models, Anatomic; Orthopedic Procedures/education; Simulation Training/methods; Surgery, Computer-Assisted/education; Tomography, X-Ray Computed/methods; User-Computer Interface; Cadaver; Clinical Competence; Humans; Learning Curve; Lumbar Vertebrae/diagnostic imaging; Male; Orthopedic Procedures/methods; Printing, Three-Dimensional; Surgery, Computer-Assisted/instrumentation
2.
Healthc Technol Lett; 4(5): 179-183, 2017 Oct.
Article in English | MEDLINE | ID: mdl-29184661

ABSTRACT

Minimally invasive surgeries (MIS) are gaining popularity as alternatives to conventional open surgeries. In thoracoscopic scoliosis MIS, fluoroscopy is used to guide pedicle screw placement and to visualise the effect of the intervention on the spine curvature. However, cosmetic external appearance is the most important concern for patients, whereas correcting the spine and achieving coronal and sagittal trunk balance are the top priorities for surgeons. The authors present a feasibility study of the first intra-operative assistive system for scoliosis surgery, composed of a single RGBD camera affixed to a C-arm, which allows the effect of the surgery on the patient's trunk surface to be visualised in real time in the transverse plane. They perform three feasibility experiments, ranging from simulated data based on scoliotic patients to live acquisitions on a non-scoliotic mannequin and a non-scoliotic person, all showing that the accuracy of the proposed system is comparable with the state of the art in scoliotic trunk surface reconstruction.
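One building block such a system needs is extracting a transverse-plane cross-section of the trunk surface from the RGB-D point cloud. The sketch below illustrates a minimal version of that step; the synthetic point cloud, slab thickness, and axis conventions are assumptions for illustration, not details from the paper.

```python
# Minimal sketch: transverse-plane cross-section of a surface point cloud.
import numpy as np

def transverse_slice(points_mm: np.ndarray, height_mm: float,
                     slab_mm: float = 5.0) -> np.ndarray:
    """Return (x, y) contour points lying within a thin axial slab at height z."""
    mask = np.abs(points_mm[:, 2] - height_mm) < slab_mm / 2.0
    slice_xy = points_mm[mask, :2]
    # Order points by angle around their centroid so they form a closed contour.
    centered = slice_xy - slice_xy.mean(axis=0)
    order = np.argsort(np.arctan2(centered[:, 1], centered[:, 0]))
    return slice_xy[order]

# Usage with a synthetic elliptic cylinder standing in for a trunk surface.
theta = np.random.uniform(0, 2 * np.pi, 5000)
z = np.random.uniform(0, 400, 5000)
cloud = np.column_stack([150 * np.cos(theta), 100 * np.sin(theta), z])
contour = transverse_slice(cloud, height_mm=200.0)
```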

3.
Comput Biol Med; 77: 135-147, 2016 Oct 1.
Article in English | MEDLINE | ID: mdl-27544070

ABSTRACT

X-ray remains the essential imaging modality for many minimally invasive interventions. Overlaying X-ray images with an optical view of the surgical scene has been demonstrated to be an efficient way to reduce radiation exposure and surgery time. However, clinicians are advised to place the X-ray source under the patient table, while the optical view of the real scene must be captured from above in order to see the patient, the surgical tools, and the surgical site. With the help of an RGB-D (red-green-blue-depth) camera, which measures depth in addition to color, the 3D model of the real scene is registered to the X-ray image. However, fusing two opposing viewpoints and visualizing them in the context of medical applications has never been attempted. In this paper, we report first experiences with a novel inverse visualization technique for RGB-D augmented C-arms. A user study with 16 participants demonstrated that our method produces a meaningful visualization with the potential to provide clinicians with multi-modal fused data in real time during surgery.
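The fusion step implied here can be sketched as projecting RGB-D surface points into the X-ray image using a 3x4 projection matrix obtained from a prior calibration; the matrix values and the sample point below are placeholders, not the paper's data.

```python
# Minimal sketch: project RGB-D points into the X-ray image via a known 3x4 matrix.
import numpy as np

def project_to_xray(points_mm: np.ndarray, P: np.ndarray) -> np.ndarray:
    """Project Nx3 points (RGB-D camera frame, mm) to pixel coordinates via P (3x4)."""
    homo = np.hstack([points_mm, np.ones((points_mm.shape[0], 1))])   # N x 4 homogeneous
    uvw = homo @ P.T                                                   # N x 3
    return uvw[:, :2] / uvw[:, 2:3]                                    # perspective divide

# Placeholder projection matrix (in practice from the RGB-D / C-arm calibration).
P = np.array([[1000.0, 0.0, 512.0, 0.0],
              [0.0, 1000.0, 512.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
pixels = project_to_xray(np.array([[10.0, -5.0, 800.0]]), P)
```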


Subject(s)
Imaging, Three-Dimensional/methods; Surgery, Computer-Assisted/methods; Color; Equipment Design; Hand/diagnostic imaging; Humans; Imaging, Three-Dimensional/instrumentation; Phantoms, Imaging; Video Recording
4.
Int J Comput Assist Radiol Surg; 11(6): 1007-1014, 2016 Jun.
Article in English | MEDLINE | ID: mdl-26995603

ABSTRACT

PURPOSE: In many orthopedic surgeries, medical instruments (e.g., a K-wire or drill) must be placed correctly to repair bone fractures. The main challenge is the mental alignment of X-ray images acquired with a C-arm, the medical instruments, and the patient, which becomes dramatically more complex during pelvic surgeries. Current solutions rely on the continuous acquisition of many intra-operative X-ray images from various views, which results in high radiation exposure, long surgical durations, and significant effort and frustration for the surgical staff. This work presents a preclinical usability study that tests and evaluates mixed reality visualization techniques using intra-operative X-ray, optical, and RGBD imaging to augment the surgeon's view and assist accurate placement of tools. METHOD: We design and perform a usability study to compare the performance of surgeons and their task load using three different mixed reality systems during K-wire placement. The three systems are interventional X-ray imaging, X-ray augmentation on 2D video, and 3D surface reconstruction augmented by digitally reconstructed radiographs (DRRs) and live tool visualization. RESULTS: The evaluation criteria include duration, number of X-ray images acquired, placement accuracy, and surgical task load, observed during 21 clinically relevant interventions performed by surgeons on phantoms. Finally, we test for statistically significant improvements and show that the mixed reality visualization leads to significantly improved efficiency. CONCLUSION: The 3D visualization of patient, tool, and DRR shows clear advantages over conventional X-ray imaging and provides intuitive feedback for placing the medical tools correctly and efficiently.
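The kind of significance test such an evaluation relies on can be sketched as a paired, non-parametric comparison of per-trial task durations under two visualization conditions; the numbers and the choice of the Wilcoxon signed-rank test are illustrative assumptions, not the paper's analysis.

```python
# Minimal sketch: paired comparison of task durations under two conditions.
import numpy as np
from scipy.stats import wilcoxon

durations_xray_only = np.array([412.0, 389.0, 455.0, 401.0, 437.0, 420.0, 398.0])  # seconds
durations_mixed_rt  = np.array([311.0, 298.0, 342.0, 305.0, 330.0, 322.0, 290.0])

stat, p_value = wilcoxon(durations_xray_only, durations_mixed_rt)
print(f"Wilcoxon W={stat:.1f}, p={p_value:.4f}")
```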


Subject(s)
Bone Wires; Fracture Fixation, Internal/methods; Fractures, Bone/surgery; Pelvic Bones/surgery; Phantoms, Imaging; Radiography, Interventional/methods; Tomography, X-Ray Computed/methods; Fractures, Bone/diagnosis; Humans; Imaging, Three-Dimensional/methods; Pelvic Bones/diagnostic imaging; Pelvic Bones/injuries
5.
Int J Comput Assist Radiol Surg; 11(6): 853-861, 2016 Jun.
Article in English | MEDLINE | ID: mdl-26984551

ABSTRACT

INTRODUCTION: In the modern operating room, the surgeon performs surgeries with the support of different medical systems that display patient information, physiological data, and medical images. It is generally accepted that numerous interactions must be performed by the surgical team to control the corresponding medical system and retrieve the desired information. Joysticks and physical keys are still present in the operating room due to the disadvantages of mice, and surgeons often communicate instructions to the surgical team when they require information from a specific medical system. In this paper, a novel user interface is developed that allows the surgeon to personally perform touchless interaction with the various medical systems and to switch effortlessly among them, all without modifying the systems' software or hardware. METHODS: To achieve this, a wearable RGB-D sensor is mounted on the surgeon's head for inside-out tracking of his/her finger relative to any of the medical systems' displays. Android devices running a dedicated application are connected to the computers on which the medical systems run, simulating a standard USB mouse and keyboard. When the surgeon interacts using pointing gestures, the desired cursor position on the targeted medical system display and the gestures are transformed into generic events and sent to the corresponding Android device. Finally, the application running on the Android device generates the corresponding mouse or keyboard events for the targeted medical system. RESULTS AND CONCLUSION: To simulate an operating room setting, our user interface was tested by seven medical participants who performed several interactions with visualizations of CT, MRI, and fluoroscopy images at varying distances from the displays. Results from the System Usability Scale and the NASA-TLX workload index indicated strong acceptance of the proposed user interface.
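The mapping from a pointing gesture to a cursor position can be sketched as intersecting the eye-to-fingertip ray with the plane of the targeted display and converting the hit point to pixel coordinates; the display geometry, resolution, and coordinate frames below are assumptions for illustration, not the paper's setup.

```python
# Minimal sketch: pointing ray -> cursor pixel on a known display plane.
import numpy as np

def pointing_to_cursor(eye, fingertip, display_origin, display_x, display_y,
                       size_mm, resolution_px):
    """Return (u, v) pixel coordinates where the eye->fingertip ray hits the display."""
    eye, fingertip = np.asarray(eye, float), np.asarray(fingertip, float)
    ray = fingertip - eye
    normal = np.cross(display_x, display_y)
    t = np.dot(display_origin - eye, normal) / np.dot(ray, normal)   # ray-plane intersection
    hit = eye + t * ray
    local = hit - display_origin                                      # in display-plane frame
    u = np.dot(local, display_x) / size_mm[0] * resolution_px[0]
    v = np.dot(local, display_y) / size_mm[1] * resolution_px[1]
    return int(round(u)), int(round(v))

# Usage: a hypothetical 600x340 mm, 1920x1080 px display 1.5 m in front of the surgeon.
cursor = pointing_to_cursor(eye=[0, 0, 0], fingertip=[60, -30, 400],
                            display_origin=[-300, 170, 1500],
                            display_x=[1, 0, 0], display_y=[0, -1, 0],
                            size_mm=(600, 340), resolution_px=(1920, 1080))
```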


Subject(s)
General Surgery/instrumentation; Operating Room Information Systems; Operating Rooms/methods; Surgical Procedures, Operative; User-Computer Interface; Attitude of Health Personnel; Equipment Design; Gestures; Humans; Software
6.
Int J Comput Assist Radiol Surg; 11(8): 1385-1395, 2016 Aug.
Article in English | MEDLINE | ID: mdl-26811080

ABSTRACT

PURPOSE: Calibration and registration are the first steps for augmented reality and mixed reality applications. In the medical field, the calibration between an RGB-D camera and a C-arm fluoroscope is a new topic that introduces its own challenges. METHOD: A convenient and efficient calibration phantom is designed by combining a traditional X-ray calibration object with a checkerboard plane. After localizing the 2D marker points in the X-ray images and the corresponding 3D points from the RGB-D images, we calculate the projection matrix from the RGB-D sensor coordinates to the X-ray image, instead of estimating the extrinsic and intrinsic parameters simultaneously. VALIDATION: In order to evaluate the effect of every step of our calibration process, we performed five experiments combining different steps leading to the calibration. We also compared our calibration method to Tsai's method to evaluate the advancement of our solution. Finally, we simulated the process of estimating the rotational movement of the RGB-D camera in MATLAB and demonstrate that calculating the projection matrix can reduce the angular error of the rotation. RESULTS: An RMS reprojection error of 0.5 mm is achieved with our calibration method, which is promising for surgical applications. Our calibration method is more accurate than Tsai's method. Lastly, the simulation shows that using a projection matrix yields a lower error in the rotation estimation than using intrinsic and extrinsic parameters. CONCLUSIONS: We designed and evaluated a 3D/2D calibration method for the combination of an RGB-D camera and a C-arm fluoroscope.
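Calculating a projection matrix directly from 3D-2D correspondences is commonly done with the direct linear transform (DLT); the sketch below shows that standard approach under the assumption that at least six RGB-D/X-ray marker correspondences are available, with placeholder data rather than the paper's measurements.

```python
# Minimal sketch: DLT estimation of a 3x4 projection matrix from 3D-2D pairs.
import numpy as np

def estimate_projection_matrix(pts3d: np.ndarray, pts2d: np.ndarray) -> np.ndarray:
    """Solve for P (3x4) minimizing algebraic error over >= 6 correspondences."""
    A = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)          # right null-space vector = flattened P

def reprojection_rmse(P: np.ndarray, pts3d: np.ndarray, pts2d: np.ndarray) -> float:
    """RMS reprojection error of P over the given correspondences."""
    homo = np.hstack([pts3d, np.ones((len(pts3d), 1))]) @ P.T
    proj = homo[:, :2] / homo[:, 2:3]
    return float(np.sqrt(np.mean(np.sum((proj - pts2d) ** 2, axis=1))))
```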


Subject(s)
Fluoroscopy/instrumentation; Surgery, Computer-Assisted/instrumentation; Calibration; Fluoroscopy/methods; Humans; Phantoms, Imaging
7.
Med Image Comput Comput Assist Interv; 17(Pt 2): 659-666, 2014.
Article in English | MEDLINE | ID: mdl-25485436

ABSTRACT

We present a user interface concept that resolves the challenges involved in controlling angiographic C-arms, which must be constantly repositioned during interventions by either the surgeons or the surgical staff. Our aim is to shift the paradigm of the interventional image acquisition workflow from traditional control device interfaces to 'desired-view' control. This allows physicians to communicate only the desired outcome of imaging, based on simulated X-rays from pre-operative CT or CTA data, while the system computes the positioning of the imaging device relative to the patient's anatomy through inverse kinematics and CT-to-patient registration. Together with our clinical partners, we evaluate the new technique using five patient CTA datasets and their corresponding intraoperative X-ray angiography datasets.
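The 'simulated X-rays from pre-operative CT' that desired-view control relies on are digitally reconstructed radiographs (DRRs); the sketch below shows a deliberately simple parallel-beam DRR, where the HU-to-attenuation conversion and the parallel-beam approximation are simplifying assumptions rather than the paper's renderer.

```python
# Minimal sketch: parallel-beam DRR from a CT volume for a desired viewing angle.
import numpy as np
from scipy.ndimage import rotate

def simple_drr(ct_hu: np.ndarray, angle_deg: float) -> np.ndarray:
    """Parallel-beam DRR of a CT volume (z, y, x), viewed after rotation about z."""
    mu = np.clip((ct_hu + 1000.0) / 1000.0, 0.0, None)        # crude HU -> attenuation
    rotated = rotate(mu, angle_deg, axes=(1, 2), reshape=False, order=1)
    line_integrals = rotated.sum(axis=2)                       # integrate along the ray direction
    return np.exp(-0.02 * line_integrals)                      # Beer-Lambert with arbitrary scale

# Usage with a synthetic volume standing in for pre-operative CT/CTA data.
volume = np.full((64, 128, 128), -1000.0)                      # air background
volume[20:40, 40:90, 40:90] = 300.0                            # a block of bone-like HU
drr = simple_drr(volume, angle_deg=30.0)
```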


Subject(s)
Angiography/methods; Aortic Aneurysm/diagnostic imaging; Image Enhancement/methods; Robotics/methods; Surgery, Computer-Assisted/methods; Tomography, X-Ray Computed/methods; User-Computer Interface; Algorithms; Humans; Image Interpretation, Computer-Assisted/methods; Reproducibility of Results; Robotics/instrumentation; Sensitivity and Specificity; Software; Surgery, Computer-Assisted/instrumentation; Tomography, X-Ray Computed/instrumentation