1.
Front Neurosci; 10: 219, 2016.
Article in English | MEDLINE | ID: mdl-27303255

ABSTRACT

Motor learning is based on motor perception and emergent perceptual-motor representations. Much behavioral research has addressed single perceptual modalities, but over the last two decades the contribution of multimodal perception to motor behavior has been increasingly recognized. A growing number of studies indicates an enhanced impact of multimodal stimuli on motor perception, motor control, and motor learning, in terms of better precision and higher reliability of the related actions. The behavioral research is supported by neurophysiological data revealing that multisensory integration supports motor control and learning. However, the overwhelming part of both lines of research is dedicated to basic research. Apart from research in the domains of music, dance, and motor rehabilitation, there is almost no evidence for an enhanced effectiveness of multisensory information on the learning of gross motor skills. To reduce this gap, movement sonification is used here in applied research on motor learning in sports. Based on current knowledge of the multimodal organization of the perceptual system, we generate additional real-time movement information suitable for integration with the visual and proprioceptive perceptual feedback streams. With ongoing training, synchronously processed auditory information should initially be integrated into the emerging internal models, enhancing the efficacy of motor learning. This is achieved by a direct mapping of kinematic and dynamic motion parameters to electronic sounds, resulting in continuous auditory and convergent audiovisual or audio-proprioceptive stimulus arrays. In sharp contrast to other approaches that use acoustic information as error feedback in motor learning settings, we try to generate additional movement information that is suitable for accelerating and enhancing adequate sensorimotor representations and that can be processed below the level of consciousness. In the experimental setting, participants were asked to learn a closed motor skill (technique acquisition in indoor rowing). One group was trained with visual information and two groups with audiovisual information (sonification vs. natural sounds). Learning became evident and remained stable in all three groups. Participants trained with additional movement sonification showed better performance than both other groups. The results indicate that movement sonification enhances motor learning of a complex gross motor skill, even exceeding the usually expected acoustic rhythmic effects on motor learning.
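The core mechanism in this abstract, a direct mapping of kinematic and dynamic motion parameters to electronic sounds, can be sketched in a few lines of code. The sketch below is purely illustrative: the choice of parameter (a simulated rowing-stroke speed profile), the pitch and amplitude ranges, and the sine synthesis are assumptions for demonstration, not the study's actual implementation.

# Hypothetical sketch of direct parameter-to-sound mapping ("movement
# sonification"): a kinematic signal (here, a simulated rowing-handle
# speed profile) continuously drives the pitch and loudness of a
# synthesized tone. Parameter choice, pitch range, and synthesis are
# illustrative assumptions, not the implementation used in the study.
import numpy as np
from scipy.io import wavfile

SR = 44100                      # audio sample rate (Hz)
stroke = 2.0                    # duration of one rowing stroke (s)
t = np.linspace(0.0, stroke, int(SR * stroke), endpoint=False)

# Simulated handle-speed profile, normalized 0..1 (bell-shaped over one stroke).
speed = np.sin(np.pi * t / stroke) ** 2

# Direct mapping: speed -> pitch (220-880 Hz) and amplitude (0.2-1.0).
freq = 220.0 + speed * (880.0 - 220.0)
amp = 0.2 + 0.8 * speed

# Accumulate phase so the instantaneous frequency follows the mapping.
phase = 2.0 * np.pi * np.cumsum(freq) / SR
audio = (amp * np.sin(phase)).astype(np.float32)

wavfile.write("rowing_sonification.wav", SR, audio)

Accumulating phase from the frequency trajectory, rather than computing sin(2*pi*f*t) directly, is what makes the pitch glide continuously with the movement instead of producing clicks at frequency changes.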

2.
Multisens Res; 26(6): 533-52, 2013.
Article in English | MEDLINE | ID: mdl-24800411

ABSTRACT

Although visual perception is dominant in motor perception, control, and learning, auditory information can enhance and modulate perceptual as well as motor processes in a multifaceted manner. Over the last decades, new methods of auditory augmentation have been developed, with movement sonification as one of the most recent approaches, extending auditory movement information even to usually silent phases of movement. Despite general evidence on the effectiveness of movement sonification in different fields of applied research, there is nearly no empirical evidence on how the sonification of human gross motor movement should be configured to achieve information-rich sound sequences. Empirical evidence is lacking for (a) the selection of suitable movement features, (b) effective kinematic-acoustic mapping patterns, and (c) the number of dimensions regarded in the sonification. In this study we explore the informational content of artificial acoustic kinematics, in the form of a kinematic movement sonification, using an intermodal discrimination paradigm. In a repeated-measures design, we analysed discrimination rates for six everyday upper-limb actions to evaluate the effectiveness of seven different kinematic-acoustic mappings as well as short-term learning effects. The kinematics of the upper-limb actions were calculated from inertial motion sensor data and transformed into seven different sonifications. Sound sequences were presented to participants in random order, and discrimination rates as well as confidence of choice were analysed. The data indicate an instantaneous comprehensibility of the artificial movement acoustics as well as short-term learning effects. No differences between the dimensional encodings became evident, indicating a high efficiency of intermodal pattern discrimination for the acoustically coded velocity distribution of the actions. Taken together, movement information related to continuous kinematic parameters can be transformed into the auditory domain. Additionally, pattern-based action discrimination is evidently not restricted to the visual modality. Artificial acoustic kinematics might be used to supplement and/or substitute visual motion perception in sports and motor rehabilitation.


Subject(s)
Auditory Perception/physiology, Motion Perception/physiology, Movement/physiology, Phonetics, Speech Acoustics, Adult, Biomechanical Phenomena, Female, Humans, Male, Psychomotor Performance/physiology, Young Adult
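A hypothetical sketch of the pipeline this abstract implies: inertial sensor data are integrated to limb kinematics, which then feed alternative kinematic-acoustic mappings. The sensor rate, the crude drift handling, and the two mapping variants shown (speed-to-pitch vs. speed-to-loudness) are assumptions for illustration; the study's seven actual mappings are not specified in the abstract.

# Hypothetical sketch: accelerometer data -> limb kinematics ->
# alternative kinematic-acoustic mappings. All numeric choices are
# illustrative assumptions, not the study's configuration.
import numpy as np

SR_SENSOR = 100                         # assumed inertial sensor rate (Hz)
dt = 1.0 / SR_SENSOR

def speed_from_acceleration(acc, dt):
    """Integrate 3-axis acceleration (m/s^2) to a speed profile (m/s).

    Mean removal stands in for proper sensor-drift correction.
    """
    vel = np.cumsum(acc, axis=0) * dt           # naive integration
    vel -= vel.mean(axis=0)                     # crude drift removal
    return np.linalg.norm(vel, axis=1)          # speed magnitude

def map_to_pitch(speed, f_lo=200.0, f_hi=800.0):
    """One mapping variant: speed drives pitch, amplitude held fixed."""
    s = (speed - speed.min()) / (np.ptp(speed) + 1e-9)
    return f_lo + s * (f_hi - f_lo)

def map_to_amplitude(speed):
    """Alternative variant: speed drives loudness, pitch held fixed."""
    s = (speed - speed.min()) / (np.ptp(speed) + 1e-9)
    return 0.1 + 0.9 * s

# Simulated 2 s upper-limb action: smooth acceleration then deceleration.
t = np.arange(0.0, 2.0, dt)
acc = np.stack([np.sin(np.pi * t),
                0.2 * np.cos(np.pi * t),
                np.zeros_like(t)], axis=1)
speed = speed_from_acceleration(acc, dt)
print(map_to_pitch(speed)[:5], map_to_amplitude(speed)[:5])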
3.
Brain Res; 1252: 94-104, 2009 Feb 03.
Article in English | MEDLINE | ID: mdl-19083992

ABSTRACT

Audio-visual integration in the human brain influences the perception and precision of motor tasks. We tested audio-visual integration during height estimation when presenting video clips of counter-movement jumps (CMJs), using sparse-sampling fMRI at 3 T. Employing the technique of "sonification", we created artificial auditory-visual motion events by transforming the ground reaction force of the CMJs into the auditory domain, modulating the frequency and amplitude of the standard pitch "A" (440 Hz). We combined these sonified movements with either concordant or discordant visual movement displays. We hypothesized that processing of concordant audio-visual stimuli would enhance neural activity in audio-visual integration areas. Four conditions were therefore compared: 1. unimodal visual, 2. unimodal auditory, 3. auditory + visual concordant, and 4. auditory + visual discordant. The unimodal conditions, when compared against each other, resulted in the expected activation maxima in primary visual and auditory cortex, respectively. Enhanced activation was found bilaterally in area V5/MT for the concordant multimodal condition compared with both unimodal conditions. This effect was specific to the concordant bimodal condition, as evidenced by a direct comparison between the concordant and discordant bimodal conditions. Using sonification, we provide evidence that area V5/MT is modulated by concordant auditory input despite the artificial nature of the stimuli, which argues for a role of this region in multimodal motion integration beyond the pure visual domain. This may explain previous behavioral evidence of facilitatory effects exerted by auditory motion stimuli on the perception of visual motion, and may provide the basis for future applications in motor learning and rehabilitation.


Subject(s)
Auditory Perception/physiology, Motion Perception/physiology, Occipital Lobe/physiology, Adult, Brain/physiology, Brain Mapping, Female, Humans, Magnetic Resonance Imaging, Male, Young Adult
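The sonification rule stated in this abstract, ground reaction force modulating the frequency and amplitude of the standard pitch "A" (440 Hz), translates directly into a short sketch. Only the 440 Hz anchor and the force-driven modulation come from the text; the simulated force profile and the modulation depths below are assumptions.

# Sketch of the described GRF sonification: vertical ground reaction
# force modulates both frequency (around A = 440 Hz) and amplitude of a
# tone. The simulated counter-movement-jump force profile and the
# modulation depths are illustrative assumptions.
import numpy as np
from scipy.io import wavfile

SR = 44100
dur = 1.5                                   # one counter-movement jump (s)
t = np.linspace(0.0, dur, int(SR * dur), endpoint=False)

# Simulated vertical GRF in body weights: unweighting, push-off drive,
# flight phase (zero force), landing impact.
grf = np.piecewise(
    t,
    [t < 0.4, (t >= 0.4) & (t < 0.8), (t >= 0.8) & (t < 1.1), t >= 1.1],
    [lambda x: 1.0 - 0.5 * x / 0.4,              # unweighting
     lambda x: 0.5 + 2.0 * (x - 0.4) / 0.4,      # push-off drive
     0.0,                                        # flight phase
     lambda x: 2.5 * np.exp(-(x - 1.1) / 0.1)])  # landing impact

# GRF modulates frequency and amplitude together (1 body weight -> 440 Hz).
freq = 440.0 * (0.5 + 0.5 * grf)
amp = np.clip(grf / 2.5, 0.0, 1.0)
phase = 2.0 * np.pi * np.cumsum(freq) / SR
wavfile.write("cmj_sonification.wav", SR,
              (amp * np.sin(phase)).astype(np.float32))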