Results 1 - 4 of 4
1.
GMS J Med Educ ; 40(2): Doc18, 2023.
Article in English | MEDLINE | ID: mdl-37361242

ABSTRACT

Background: Medical students need to be prepared for various situations in clinical decision-making that cannot be systematically trained with real patients without risking their health or integrity. To address the system-related limitations of actor-based training, digital learning methods are increasingly used in medical education, and virtual reality (VR) training appears to have high potential. Virtually generated training scenarios allow repetitive training of highly relevant clinical skills within a protected, realistic learning environment. Thanks to artificial intelligence (AI), face-to-face interaction with virtual agents is feasible. Combining this technology with VR simulations offers a new way of situated, context-based, first-person training for medical students. Project goal and method: The authors' aim is to develop a modular digital training platform for medical education with virtual, interactable agents and to integrate this platform into the medical curriculum. The medical tr.AI.ning platform will provide veridical simulation of clinical scenarios with virtual patients, augmented with highly realistic medical pathologies within a customizable, realistic situational context. Medical tr.AI.ning is scaled to four complementary developmental steps with different scenarios that can be used separately, so that each outcome can be integrated into the project early and successively. Each step has its own focus (visual, movement, communication, combination) and, through its modularity, extends an author toolbox. The modules of each step will be specified and designed together with experts in medical didactics.
Perspective: To ensure constant improvement of user experience, realism, and medical validity, the authors will perform regular iterative evaluation rounds. Furthermore, integrating medical tr.AI.ning into the medical curriculum will enable long-term, large-scale assessment of the benefits and limitations of this approach, providing enhanced alternative teaching paradigms based on VR technology.


Subject(s)
Artificial Intelligence , Virtual Reality , Humans , Computer Simulation , Curriculum , Clinical Competence , Clinical Decision-Making
2.
Article in English | MEDLINE | ID: mdl-31352336

ABSTRACT

In this paper, we demonstrate how thumb-to-finger tap interaction can be employed to perform eyes-free discrete symbolic input in virtual and augmented reality environments. Our DigiTap device is worn on the wrist to keep the hand free from any instrumentation, such that tactile sense and dexterity are not impaired. DigiTap senses the jerk that is caused by a tap and takes an image sequence to detect the tap location. The device is able to recognize taps at twelve different locations on the fingers, and at some positions it can even distinguish between different tap strengths. We have conducted an extended user study to evaluate users' ability to interact with the device and to perform symbolic input.

3.
IEEE Comput Graph Appl ; 35(5): 42-54, 2015.
Article in English | MEDLINE | ID: mdl-26416361

ABSTRACT

Thumb-to-finger tap interaction can be employed to perform eyes-free, discrete, symbolic input in virtual and augmented reality environments. The DigiTap device is worn on the wrist to keep the hand free from any instrumentation so as not to impair tactile sense and dexterity. DigiTap senses the jerk that is caused by a tap and takes an image sequence to detect the tap location. The device can recognize taps at 12 different locations on the fingers, and at some positions, it can even distinguish between different tap strengths. The authors conducted an extended user study to evaluate users' abilities to interact with the device and perform symbolic input.
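The abstract describes a two-stage sensing pipeline: a jerk spike in the wrist-worn sensor's signal indicates that a tap occurred, which then triggers image capture to localize it. A minimal sketch of the first stage, assuming a sampled acceleration magnitude stream and an illustrative threshold (the function names, sampling interval, and threshold are assumptions, not the authors' implementation):

```python
# Hypothetical sketch of jerk-based tap triggering, as described in the
# DigiTap abstract: a tap shows up as a sharp change in acceleration
# (a jerk spike), which would then trigger image capture for locating
# the tap on the fingers. Threshold and dt are illustrative assumptions.

def jerk(accel, dt):
    """First difference of acceleration magnitude per sample (jerk)."""
    return [(a2 - a1) / dt for a1, a2 in zip(accel, accel[1:])]

def detect_taps(accel, dt=0.01, threshold=50.0):
    """Return sample indices whose preceding jerk exceeds the threshold."""
    taps = []
    for i, j in enumerate(jerk(accel, dt)):
        if abs(j) > threshold:
            taps.append(i + 1)  # sample just after the spike
    return taps

# A slow ramp yields no taps; a sudden acceleration step does.
quiet = [0.0, 0.1, 0.2, 0.3, 0.4]
tap = [0.0, 0.1, 5.0, 0.1, 0.0]
print(detect_taps(quiet))  # []
print(detect_taps(tap))    # [2, 3]
```

In the device as described, a detection like this would gate the second stage (the image sequence that classifies which of the twelve finger locations was tapped); that classifier is not sketched here.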


Subject(s)
Accelerometry/instrumentation , Computer Peripherals , Fingers/physiology , Imaging, Three-Dimensional/instrumentation , Touch/physiology , User-Computer Interface , Equipment Design , Equipment Failure Analysis , Humans , Lighting/instrumentation , Monitoring, Ambulatory/instrumentation , Photometry/instrumentation , Transducers
4.
PLoS One ; 8(1): e53963, 2013.
Article in English | MEDLINE | ID: mdl-23349775

ABSTRACT

We designed a novel imaging technique based on frustrated total internal reflection (FTIR) to obtain high-resolution, high-contrast movies. This FTIR-based Imaging Method (FIM) is suitable for a wide range of biological applications and organisms. It operates at all wavelengths, permitting the in vivo detection of fluorescent proteins. To demonstrate the benefits of FIM, we analyzed large groups of crawling Drosophila larvae. The number of analyzable locomotion tracks was increased by implementing a new software module capable of preserving larval identity during most collision events. This module is integrated into our new tracking program, FIMTrack, which subsequently extracts a number of features required for the analysis of complex locomotion phenotypes. FIM enables high-throughput screening for even subtle behavioral phenotypes. We tested this newly developed setup by analyzing locomotion deficits caused by the glial knockdown of several genes. Suppression of kinesin heavy chain (khc) or rab30 function led to contraction-pattern or head-sweeping defects that had escaped previous analyses. Thus, FIM permits forward genetic screens aimed at unraveling the neural basis of behavior.
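The identity-preserving tracking described above can be illustrated with a minimal frame-to-frame assignment step: each existing track claims its nearest unclaimed detection, and when larvae collide and their detections merge, one track simply goes unassigned until they separate, so identities are not swapped. This is an illustrative sketch, not the published FIMTrack algorithm; all names and the distance threshold are assumptions.

```python
# Hypothetical sketch of identity-preserving frame-to-frame matching in
# the spirit of FIMTrack: tracks greedily claim the closest unclaimed
# detection within max_dist. During a collision (detections merge), a
# track can receive None rather than stealing another track's identity.

import math

def assign_detections(tracks, detections, max_dist=10.0):
    """Match each track's last position {id: (x, y)} to the closest
    unclaimed detection [(x, y), ...]; return {id: index or None}."""
    claimed = set()
    assignment = {}
    for tid, (tx, ty) in tracks.items():
        best, best_d = None, max_dist
        for i, (dx, dy) in enumerate(detections):
            if i in claimed:
                continue  # another track already owns this detection
            d = math.hypot(dx - tx, dy - ty)
            if d < best_d:
                best, best_d = i, d
        if best is not None:
            claimed.add(best)
        assignment[tid] = best  # None: larva occluded or in a collision
    return assignment

tracks = {1: (0.0, 0.0), 2: (5.0, 0.0)}
detections = [(0.5, 0.0), (4.8, 0.1)]
print(assign_detections(tracks, detections))  # {1: 0, 2: 1}
```

With a single merged detection during a collision, only one track claims it and the other is held over as `None`, which is one simple way identity can be preserved until the larvae separate.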


Subject(s)
Drosophila Proteins/physiology , Drosophila/physiology , Locomotion/physiology , Video Recording/methods , Animals , Drosophila/genetics , Drosophila Proteins/genetics , Female , Kinesins/genetics , Kinesins/physiology , Larva/genetics , Larva/physiology , Locomotion/genetics , Male , Monomeric GTP-Binding Proteins/genetics , Monomeric GTP-Binding Proteins/physiology , Neuroglia/metabolism , RNA Interference , Reproducibility of Results