Results 1 - 3 of 3
1.
Article in English | MEDLINE | ID: mdl-37027713

ABSTRACT

In the embryonic human heart, complex dynamic shape changes take place in a short period of time on a microscopic scale, making this development difficult to visualize. However, a spatial understanding of these processes is essential for students and future cardiologists to properly diagnose and treat congenital heart defects. Following a user-centered approach, the most crucial embryological stages were identified and translated into a virtual reality learning environment (VRLE) that conveys the morphological transitions of these stages through advanced interactions. To address individual learning types, we implemented different features and evaluated the application regarding usability, perceived task load, and sense of presence in a user study. We also assessed spatial awareness and knowledge gain, and finally obtained feedback from domain experts. Overall, students and professionals rated the application positively. To minimize distraction from interactive learning content, such VRLEs should provide features for different learning types, allow for gradual habituation, and at the same time offer enough playful stimuli. Our work previews how VR can be integrated into a cardiac embryology education curriculum.

2.
Int J Comput Assist Radiol Surg; 18(8): 1429-1436, 2023 Aug.
Article in English | MEDLINE | ID: mdl-36565368

ABSTRACT

PURPOSE: Past research has investigated and developed robotic ultrasound (US) systems. In this context, interfaces that allow for interaction with the robotic system are of paramount importance. Few researchers have addressed the issue of developing non-tactile interaction approaches, although these could be beneficial for maintaining sterility during medical procedures. Interaction could be supported by multimodality, which has the potential to enable intuitive and natural interaction. To assess the feasibility of multimodal interaction for non-tactile control of a co-located robotic ultrasound system, a novel human-robot interaction concept was developed. METHODS: The medical use case of needle-based interventions under hybrid computed tomography and ultrasound imaging was analyzed by interviewing four radiologists. From the resulting workflow, interaction tasks involving human-robot interaction were derived. Based on this, characteristics of a multimodal, touchless human-robot interface were elaborated, suitable interaction modalities were identified, and a corresponding interface was developed and subsequently evaluated in a user study with eight participants. RESULTS: The implemented interface combines voice commands for discrete control with hand gesture control for navigation of the robotic US probe. The interaction concept was evaluated by the users with a quantitative questionnaire, yielding average usability. Qualitative analysis of interview results revealed user satisfaction with the implemented interaction methods and potential improvements to the system. CONCLUSION: A multimodal, touchless interaction concept for a robotic US system, targeting needle-based procedures in interventional radiology, was developed, incorporating combined voice and hand gesture control. Future steps will include integrating a solution for the missing haptic feedback and evaluating the system's clinical suitability.


Subject(s)
Robotic Surgical Procedures, Robotics, Humans, User-Computer Interface, Gestures, Ultrasonography
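
The division of labor described in this abstract, voice commands for discrete control and hand gestures for continuous probe navigation, can be illustrated with a small dispatcher. The following is a minimal sketch under stated assumptions, not the authors' implementation: all names (Mode, GestureSample, RobotUSController) and the command vocabulary are hypothetical, and a real system would connect this logic to speech and hand-tracking recognition engines and to the robot's motion controller.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Mode(Enum):
    IDLE = auto()
    NAVIGATING = auto()  # continuous, gesture-driven probe motion


@dataclass
class GestureSample:
    """One frame of hand-tracking output (hypothetical units: meters)."""
    dx: float
    dy: float
    dz: float


class RobotUSController:
    """Toy stand-in for a co-located robotic ultrasound probe."""

    def __init__(self) -> None:
        self.mode = Mode.IDLE

    # --- discrete control: triggered by voice commands ---
    def on_voice(self, command: str) -> None:
        if command == "start navigation":
            self.mode = Mode.NAVIGATING
        elif command == "stop":
            self.mode = Mode.IDLE
        elif command == "confirm position":
            print("Probe pose stored for needle planning.")

    # --- continuous control: driven by the gesture stream ---
    def on_gesture(self, sample: GestureSample) -> None:
        if self.mode is not Mode.NAVIGATING:
            return  # ignore hand motion unless navigation was enabled by voice
        print(f"Move probe by ({sample.dx:.3f}, {sample.dy:.3f}, {sample.dz:.3f}) m")


if __name__ == "__main__":
    robot = RobotUSController()
    robot.on_voice("start navigation")
    robot.on_gesture(GestureSample(0.01, 0.0, -0.005))
    robot.on_voice("stop")
```

Gating continuous gesture input behind an explicit voice command, as sketched here, is one plausible way to reduce unintended probe motion; whether the paper's interface uses this exact gating is an assumption.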
3.
J Imaging; 8(10), 2022 Sep 21.
Article in English | MEDLINE | ID: mdl-36286350

ABSTRACT

Robotic assistance is applied in orthopedic interventions for pedicle screw placement (PSP). While current robots do not act autonomously, they are expected to reach higher autonomy under surgeon supervision in the mid-term. Augmented reality (AR) is a promising means of supporting this supervision and enabling human-robot interaction (HRI). To outline a futuristic scenario for robotic PSP, the current workflow was analyzed through a literature review and expert discussion. Based on this, a hypothetical workflow of the intervention was developed, which additionally covers the analysis of the necessary information exchange between human and robot. A video see-through AR prototype was designed and implemented. A robotic arm with an orthopedic drill mock-up simulated the robotic assistance. The AR prototype included a user interface to enable HRI. The interface provides data to facilitate understanding of the robot's "intentions", e.g., patient-specific CT images, the current workflow phase, or the next planned robot motion. Two-dimensional and three-dimensional visualizations illustrated patient-specific medical data and the drilling process. The findings of this work contribute a valuable approach to addressing future clinical needs and highlight the importance of AR support for HRI.
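
The human-robot information exchange outlined in this abstract (current workflow phase, next planned robot motion, references to patient-specific CT data) can be pictured as a status message the robot publishes and the AR interface renders. The sketch below is a hypothetical illustration only; the field names, workflow phases, and RobotStatus/render_overlay identifiers are assumptions, not taken from the paper.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class RobotStatus:
    """Hypothetical message a PSP robot could publish for an AR display."""
    phase: str                   # e.g. "registration", "alignment", "drilling"
    next_motion: str             # human-readable description of the planned move
    ct_series_uid: str           # reference to the patient-specific CT volume
    drill_depth_mm: float = 0.0  # current drilling progress


def render_overlay(status: RobotStatus) -> List[str]:
    """Flatten the status into the text lines an AR overlay might show."""
    return [
        f"Phase: {status.phase}",
        f"Next:  {status.next_motion}",
        f"CT:    {status.ct_series_uid}",
        f"Depth: {status.drill_depth_mm:.1f} mm",
    ]


if __name__ == "__main__":
    status = RobotStatus(
        phase="alignment",
        next_motion="rotate drill guide to planned trajectory",
        ct_series_uid="1.2.840.HYPOTHETICAL.1",
    )
    print("\n".join(render_overlay(status)))
```

Publishing a compact, explicit status message of this kind is one way an AR interface can make a robot's "intentions" inspectable to the supervising surgeon; the actual data exchange format of the prototype is not specified in the abstract.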
