Results 1 - 4 of 4
1.
Sci Rep; 14(1): 9221, 2024 Apr 22.
Article in English | MEDLINE | ID: mdl-38649681

ABSTRACT

Technological advances in head-mounted displays (HMDs) facilitate the acquisition of physiological data of the user, such as gaze, pupil size, or heart rate. Still, interactions with such systems can be prone to errors, including unintended behavior or unexpected changes in the presented virtual environments. In this study, we investigated if multimodal physiological data can be used to decode error processing, which has been studied, to date, with brain signals only. We examined the feasibility of decoding errors solely with pupil size data and proposed a hybrid decoding approach combining electroencephalographic (EEG) and pupillometric signals. Moreover, we analyzed if hybrid approaches can improve existing EEG-based classification approaches and focused on setups that offer increased usability for practical applications, such as the presented game-like virtual reality flight simulation. Our results indicate that classifiers trained with pupil size data can decode errors above chance. Moreover, hybrid approaches yielded improved performance compared to EEG-based decoders in setups with a reduced number of channels, which is crucial for many out-of-the-lab scenarios. These findings contribute to the development of hybrid brain-computer interfaces, particularly in combination with wearable devices, which allow for easy acquisition of additional physiological data.
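The paper's actual decoding pipeline is not reproduced in this listing; the sketch below only illustrates the general idea of hybrid decoding, fusing an EEG-derived feature with a pupil-size feature into one vector and classifying error vs. correct trials. All data here are synthetic, and the nearest-class-mean classifier is a deliberately minimal stand-in for whatever models the authors used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic trials: 100 "correct" and 100 "error" events (all values illustrative).
n = 100
# EEG feature: a simulated error-related potential amplitude, larger on error trials.
eeg_correct = rng.normal(0.0, 1.0, n)
eeg_error = rng.normal(1.5, 1.0, n)
# Pupil feature: simulated pupil dilation following errors.
pup_correct = rng.normal(0.0, 1.0, n)
pup_error = rng.normal(0.8, 1.0, n)

# Hybrid approach: concatenate both modalities into one feature vector per trial.
X = np.column_stack([
    np.concatenate([eeg_correct, eeg_error]),
    np.concatenate([pup_correct, pup_error]),
])
y = np.concatenate([np.zeros(n), np.ones(n)])  # 0 = correct, 1 = error

# Random train/test split.
idx = rng.permutation(2 * n)
train, test = idx[:150], idx[150:]

# Minimal nearest-class-mean classifier on the fused features.
mu0 = X[train][y[train] == 0].mean(axis=0)
mu1 = X[train][y[train] == 1].mean(axis=0)

def predict(samples):
    d0 = np.linalg.norm(samples - mu0, axis=1)
    d1 = np.linalg.norm(samples - mu1, axis=1)
    return (d1 < d0).astype(float)

acc = (predict(X[test]) == y[test]).mean()
print(f"hybrid decoding accuracy: {acc:.2f}")
```

Because both modalities carry (simulated) class information, the fused classifier lands well above the 0.5 chance level, which mirrors the abstract's claim that pupil data alone can decode errors above chance and can complement reduced-channel EEG setups.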


Subjects
Brain-Computer Interfaces, Electroencephalography, Pupil, Virtual Reality, Humans, Electroencephalography/methods, Adult, Male, Pupil/physiology, Female, Young Adult, Computer Simulation, Brain/physiology, Heart Rate/physiology
2.
Article in English | MEDLINE | ID: mdl-38083691

ABSTRACT

Algorithms detecting erroneous events, as used in brain-computer interfaces, usually rely solely on neural correlates of error perception. The increasing availability of wearable displays with built-in pupillometric sensors enables access to additional physiological data, potentially improving error detection. Hence, we measured both electroencephalographic (EEG) and pupillometric signals of 19 participants while performing a navigation task in an immersive virtual reality (VR) setting. We found EEG and pupillometric correlates of error perception and significant differences between distinct error types. Further, we found that actively performing tasks delays error perception. We believe that the results of this work could contribute to improving error detection, which has rarely been studied in the context of immersive VR.


Subjects
Brain-Computer Interfaces, Virtual Reality, Humans, Computer Simulation, Electroencephalography, Perception
3.
Brain Lang; 171: 62-71, 2017 Aug.
Article in English | MEDLINE | ID: mdl-28535366

ABSTRACT

Communicative gestures can compensate for the incomprehensibility of oral speech in severe aphasia, but the brain damage that causes aphasia may also impair the production of gestures. We compared the comprehensibility of gestural communication in persons with severe aphasia and non-aphasic persons and used voxel-based lesion-symptom mapping (VLSM) to determine the lesion sites responsible for poor gestural expression in aphasia. At the group level, persons with aphasia conveyed more information via gestures than controls, indicating a compensatory use of gestures in persons with severe aphasia. However, individual analysis showed a broad range of gestural comprehensibility. VLSM suggested that poor gestural expression was associated with lesions in anterior temporal and inferior frontal regions. We hypothesize that the functional correlates of these localizations are the selection of, and flexible changes between, communication channels, different types of gestures, and the features of actions and objects expressed by gestures.
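VLSM, the method named in the abstract, tests at each voxel whether patients lesioned there score worse on a behavior than patients spared there. The toy sketch below is not the authors' analysis: lesion maps, scores, the "critical" voxel index, and the uncorrected mass-univariate t-test are all synthetic assumptions chosen only to show the per-voxel comparison that VLSM performs.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 30 patients, 200 voxels (flattened brain mask); binary lesion maps.
n_pat, n_vox = 30, 200
lesions = rng.random((n_pat, n_vox)) < 0.3  # True = voxel lesioned in that patient

# Hypothetical behavioral score (e.g., gesture comprehensibility), lowered
# when voxel 10 — a stand-in "critical region" — is lesioned.
score = rng.normal(50.0, 5.0, n_pat) - 12.0 * lesions[:, 10]

# Mass-univariate VLSM: per-voxel two-sample t statistic, lesioned vs. spared.
t = np.full(n_vox, np.nan)
for v in range(n_vox):
    a, b = score[lesions[:, v]], score[~lesions[:, v]]
    if len(a) < 2 or len(b) < 2:
        continue  # too few patients in one group to compare
    se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    t[v] = (a.mean() - b.mean()) / se

print("most negative t at voxel", int(np.nanargmin(t)))
```

In a real VLSM analysis the t map would be thresholded with a multiple-comparisons correction (e.g., permutation testing) before localizing the deficit; here the planted effect at voxel 10 simply produces a strongly negative t there.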


Subjects
Aphasia/pathology, Aphasia/physiopathology, Communication, Comprehension/physiology, Gestures, Adult, Aged, Aphasia/complications, Brain Injuries/complications, Brain Injuries/pathology, Brain Injuries/physiopathology, Female, Frontal Lobe/pathology, Frontal Lobe/physiopathology, Humans, Male, Middle Aged, Temporal Lobe/pathology, Temporal Lobe/physiopathology
4.
Cortex; 48(8): 952-62, 2012 Sep.
Article in English | MEDLINE | ID: mdl-21458789

ABSTRACT

Patients suffering from severe aphasia have to rely on non-verbal means of communication to convey a message. However, to date it is not clear which patients are able to do so. Clinical experience indicates that some patients use non-verbal communication strategies like gesturing very efficiently, whereas others fail to transmit semantic content by non-verbal means. Concerns have been expressed that limb apraxia may affect the production of communicative gestures. Research investigating whether and how apraxia influences the production of communicative gestures has led to contradictory outcomes. The purpose of this study was to investigate the impact of limb apraxia on spontaneous gesturing. Further, linguistic and non-verbal semantic processing abilities were explored as potential factors that might influence non-verbal expression in aphasic patients. Twenty-four aphasic patients with highly limited verbal output were asked to retell short video clips. The narrations were videotaped. Gestural communication was analyzed in two ways. In the first part of the study, we used a form-based approach: physiological and kinetic aspects of hand movements were transcribed with a notation system for sign languages, and we determined the formal diversity of the hand gestures as an indicator of the potential richness of the transmitted information. In the second part of the study, the comprehensibility of the patients' gestural communication was evaluated by naive raters. The raters were familiarized with the model video clips and shown the recordings of the patients' retellings without sound. They were asked to indicate, for each narration, which story was being told and which aspects of the stories they recognized. The results indicate that non-verbal faculties are the most important prerequisites for the production of hand gestures: whereas results on standardized aphasia testing did not correlate with any gestural indices, non-verbal semantic processing abilities predicted the formal diversity of hand gestures, while apraxia predicted the comprehensibility of gesturing.


Subjects
Aphasia, Apraxias, Gestures, Nonverbal Communication/physiology, Semantics, Adult, Aged, Aphasia/physiopathology, Apraxias/physiopathology, Female, Hand, Humans, Male, Middle Aged, Sign Language