Results 1 - 7 of 7
1.
Sci Data ; 11(1): 132, 2024 Jan 25.
Article in English | MEDLINE | ID: mdl-38272936

ABSTRACT

Investigating emotions relies on pre-validated stimuli to evaluate induced responses through subjective self-ratings and physiological changes. The creation of precise affect models necessitates extensive datasets. While datasets related to pictures, words, and sounds are abundant, those associated with videos are comparatively scarce. To overcome this challenge, we present the first virtual reality (VR) database with continuous self-ratings and physiological measures, including facial EMG. Videos were rated online using a head-mounted VR device (HMD) with an attached emteqPRO mask and a cinema VR environment, in remote home and laboratory settings with minimal setup requirements. This led to an affective video database with continuous valence and arousal self-rating measures and physiological responses (PPG, facial EMG (7x), IMU). The AVDOS-VR database includes data from 37 participants who watched 30 randomly ordered videos (10 each of positive, neutral, and negative). Each video lasted 30 seconds, with a two-minute relaxation period between categories. Validation results suggest that remote data collection is ecologically valid, providing an effective strategy for future affective study designs. All data can be accessed via www.gnacek.com/affective-video-database-online-study.
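As a minimal illustration of how such a database might be explored, the sketch below computes per-category summary statistics from a hypothetical CSV export; the file name and the column names (valence, arousal, video_category) are assumptions for illustration, not the published AVDOS-VR schema.

```python
# Minimal sketch: summarising continuous valence/arousal ratings per video
# category. The file layout and column names ("valence", "arousal",
# "video_category") are assumptions, not the published AVDOS-VR schema.
import pandas as pd

def summarise_ratings(csv_path: str) -> pd.DataFrame:
    """Return mean and standard deviation of continuous ratings per video category."""
    df = pd.read_csv(csv_path)
    return (
        df.groupby("video_category")[["valence", "arousal"]]
          .agg(["mean", "std"])
    )

if __name__ == "__main__":
    # Hypothetical per-participant export; replace with an actual file path.
    print(summarise_ratings("avdos_vr_participant_01.csv"))
```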

2.
Front Psychiatry ; 14: 1232433, 2023.
Article in English | MEDLINE | ID: mdl-37614653

ABSTRACT

Background: Continuous assessment of affective behaviors could improve the diagnosis, assessment and monitoring of chronic mental health and neurological conditions such as depression. However, there are no technologies well suited to this, limiting potential clinical applications. Aim: To test whether we could replicate previous evidence of hypo-reactivity to emotionally salient material using an entirely new sensing technique called optomyography, which is well suited to remote monitoring. Methods: Thirty-eight volunteers (≥18, ≤40 years) who met a research diagnosis of depression and 37 age-matched non-depressed controls took part. Changes in facial muscle activity over the brow (corrugator supercilii) and cheek (zygomaticus major) were measured whilst volunteers watched videos varying in emotional salience. Results: Across all participants, videos rated as subjectively positive were associated with activation of muscles in the cheek relative to videos rated as neutral or negative. Videos rated as subjectively negative were associated with brow activation relative to videos judged as neutral or positive. Self-reported arousal was associated with a step increase in facial muscle activation across the brow and cheek. Group differences took the form of significantly reduced facial muscle activation in depressed volunteers compared with controls during videos rated as subjectively negative or high in arousal. Conclusion: We demonstrate for the first time that it is possible to detect facial expression hypo-reactivity in adults with depression in response to emotional content using glasses-based optomyography sensing. It is hoped these results may encourage the use of optomyography-based sensing to track facial expressions in the real world, outside of a specialized testing environment.
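To make the kind of group comparison described above concrete, the following sketch runs a Welch t-test on placeholder per-participant activation values; the numbers are synthetic and the choice of test is an assumption for illustration, not the study's actual analysis.

```python
# Illustrative sketch (not the study's analysis code): comparing mean
# zygomaticus activation during negative videos between depressed and
# control groups with an independent-samples Welch t-test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Placeholder per-participant mean activations (arbitrary units), matching
# the reported group sizes of 38 and 37.
depressed = rng.normal(loc=0.8, scale=0.3, size=38)
controls = rng.normal(loc=1.1, scale=0.3, size=37)

t, p = stats.ttest_ind(depressed, controls, equal_var=False)  # Welch's t-test
print(f"Welch t = {t:.2f}, p = {p:.4f}")
```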

3.
PLoS One ; 18(4): e0278065, 2023.
Article in English | MEDLINE | ID: mdl-37053205

ABSTRACT

This paper describes the development and validation of the 3D Affective Virtual environments and Event Library (AVEL) for affect induction in Virtual Reality (VR) settings using an online survey, a cost-effective method for remote stimulus validation that has not been sufficiently explored. Three virtual office-replica environments were designed to induce negative, neutral and positive valence. Each virtual environment also contained several affect-inducing events/objects. The environments were validated using an online survey containing videos of the virtual environments and pictures of the events/objects. The survey was conducted with 67 participants. Participants were instructed to rate their perceived levels of valence and arousal for each virtual environment (VE), and separately for each event/object. They also rated their perceived levels of presence for each VE, and they were asked how well they remembered the events/objects presented in each VE. Finally, an alexithymia questionnaire was administered at the end of the survey. Analysis of the user ratings validated the expected affect and presence levels of each VE as well as the affect ratings for each event/object. Our results demonstrate that online validation of VE material is effective and good scientific practice for future affect-induction VR studies in affective and cognitive neuroscience and wider research settings.
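One simple way to picture this kind of validation is to test whether mean valence ratings for each environment fall on the expected side of the scale midpoint. The sketch below does this with placeholder ratings and an assumed 1-9 SAM-style scale; neither reflects the study's data or its exact analysis.

```python
# Sketch only: checking whether mean valence ratings for each virtual
# environment lie on the expected side of the scale midpoint.
# Ratings, scale (1-9), and midpoint are assumptions for illustration.
import numpy as np
from scipy import stats

ratings = {  # placeholder per-participant valence ratings per environment
    "negative_office": np.array([2, 3, 2, 4, 3, 2, 3]),
    "neutral_office":  np.array([5, 4, 5, 6, 5, 5, 4]),
    "positive_office": np.array([7, 8, 6, 7, 8, 7, 7]),
}
MIDPOINT = 5  # assumed neutral point of a 1-9 scale

for name, vals in ratings.items():
    t, p = stats.ttest_1samp(vals, MIDPOINT)  # one-sample test against midpoint
    print(f"{name}: mean={vals.mean():.2f}, t={t:.2f}, p={p:.3f}")
```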


Subjects
Environment, Virtual Reality, Humans, Arousal, User-Computer Interface, Mental Recall
4.
Sci Rep ; 12(1): 16876, 2022 10 07.
Article in English | MEDLINE | ID: mdl-36207524

ABSTRACT

Using a novel wearable surface electromyography (sEMG) device, we investigated induced affective states by measuring the activation of facial muscles traditionally associated with positive expressions (left/right orbicularis and left/right zygomaticus) and negative expressions (the corrugator muscle). In a sample of 38 participants who watched 25 affective videos in a virtual reality environment, we found that sEMG amplitude varied significantly with video content for each of the three variables examined: subjective valence, subjective arousal, and objective valence as defined by the validated video types (positive, neutral, and negative). sEMG amplitude from "positive muscles" increased when participants were exposed to positively valenced stimuli compared with negatively valenced stimuli. In contrast, activation of "negative muscles" was elevated following exposure to negatively valenced stimuli compared with positively valenced stimuli. High-arousal videos increased muscle activation compared with low-arousal videos in all the measured muscles except the corrugator muscle. In line with previous research, the relationship between sEMG amplitude and subjective valence was V-shaped.
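One common way to probe such a V-shaped relation is to fit a quadratic and inspect the curvature term. The sketch below does this on synthetic data; it illustrates the idea only and is not the study's analysis.

```python
# Illustrative sketch: a V-shaped (here approximated as quadratic) relation
# between subjective valence and sEMG amplitude can be probed by fitting a
# second-order polynomial and checking for positive curvature.
# The data below are synthetic placeholders, not the study's measurements.
import numpy as np

valence = np.linspace(-1.0, 1.0, 50)  # subjective valence, negative to positive
amplitude = np.abs(valence) + np.random.default_rng(1).normal(0, 0.05, 50)

coeffs = np.polyfit(valence, amplitude, deg=2)  # highest-order term first
print(f"quadratic term a = {coeffs[0]:.3f} (a > 0 suggests a V/U shape)")
```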


Subjects
Facial Muscles, Wearable Electronic Devices, Affect/physiology, Arousal/physiology, Electromyography, Emotions/physiology, Face/physiology, Facial Expression, Facial Muscles/physiology, Humans
5.
Sensors (Basel) ; 22(6)2022 Mar 08.
Article in English | MEDLINE | ID: mdl-35336250

ABSTRACT

Breathing rate is considered one of the fundamental vital signs and a highly informative indicator of physiological state. Given that the monitoring of heart activity is less complex than the monitoring of breathing, a variety of algorithms have been developed to estimate breathing activity from heart activity. However, estimating breathing rate from heart activity outside of laboratory conditions is still a challenge. The challenge is even greater when new wearable devices with novel sensor placements are being used. In this paper, we present a novel algorithm for breathing rate estimation from photoplethysmography (PPG) data acquired from a head-worn virtual reality mask equipped with a PPG sensor placed on the forehead of a subject. The algorithm is based on advanced signal processing and machine learning techniques and includes a novel quality assessment and motion artifact removal procedure. The proposed algorithm is evaluated and compared to existing approaches from the related work using two separate datasets containing data from a total of 37 subjects. Numerous experiments show that the proposed algorithm outperforms the compared algorithms, achieving a mean absolute error of 1.38 breaths per minute and a Pearson's correlation coefficient of 0.86. These results indicate that reliable estimation of breathing rate is possible based on PPG data acquired from a head-worn device.
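As a rough illustration of frequency-domain breathing-rate estimation from PPG, the sketch below band-passes the signal to the respiratory band and picks the dominant spectral peak. The published algorithm additionally performs quality assessment, motion-artifact removal, and machine learning, none of which is reproduced here; the sample rate, filter band, and synthetic signal are assumptions.

```python
# A much-simplified sketch of breathing-rate estimation from PPG:
# band-pass to the respiratory band (0.1-0.5 Hz), then take the dominant
# spectral peak. Not the paper's algorithm; parameters are assumptions.
import numpy as np
from scipy import signal

def estimate_breathing_rate(ppg: np.ndarray, fs: float) -> float:
    """Return breathing rate in breaths per minute from a raw PPG segment."""
    sos = signal.butter(4, [0.1, 0.5], btype="bandpass", fs=fs, output="sos")
    resp = signal.sosfiltfilt(sos, ppg)              # respiratory component
    freqs, psd = signal.welch(resp, fs=fs, nperseg=min(len(resp), 60 * int(fs)))
    band = (freqs >= 0.1) & (freqs <= 0.5)
    return float(freqs[band][np.argmax(psd[band])] * 60.0)

if __name__ == "__main__":
    fs = 100.0                                        # assumed sample rate (Hz)
    t = np.arange(0, 120, 1 / fs)
    # Synthetic PPG: 1.2 Hz cardiac pulse plus respiratory baseline wander
    # at 0.25 Hz (i.e. 15 breaths per minute).
    ppg = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 0.25 * t)
    print(f"estimated breathing rate: {estimate_breathing_rate(ppg, fs):.1f} bpm")
```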


Subjects
Photoplethysmography, Respiratory Rate, Heart Rate/physiology, Humans, Machine Learning, Photoplethysmography/methods, Signal Processing, Computer-Assisted
6.
Pain Manag ; 10(6): 399-410, 2020 Nov.
Article in English | MEDLINE | ID: mdl-33073690

ABSTRACT

Aim: Assessing pain perception through self-reports may not be possible in some patients, for example, those who are sedated. Our group considered whether facial electromyography (fEMG) could provide a useful alternative by testing healthy participants subjected to experimental pain. Materials & methods: Activity of four facial muscles was recorded using fEMG alongside self-reported pain scores and physiological parameters. Results: The pain stimulus elicited significant activity in all facial muscles of interest as well as increases in heart rate. Activity from two of the facial muscles correlated significantly with pain intensity. Conclusion: Pain perception can be assessed through fEMG in healthy participants. We believe that this model would be valuable to clinicians who need to diagnose pain perception in circumstances where verbal reporting is not possible.
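To make the reported correlation analysis concrete, the sketch below computes Pearson's r between per-trial fEMG amplitude and self-reported pain scores; the arrays and the assumed 0-10 rating scale are placeholders, not study data.

```python
# Sketch only: correlating per-trial facial EMG amplitude with self-reported
# pain intensity via Pearson's r. Arrays are placeholders, not study data.
import numpy as np
from scipy import stats

pain_scores = np.array([1, 2, 2, 3, 4, 5, 5, 6, 7, 8])        # assumed 0-10 scale
femg_amplitude = np.array([0.11, 0.14, 0.13, 0.18, 0.22,
                           0.27, 0.26, 0.31, 0.36, 0.41])     # arbitrary units

r, p = stats.pearsonr(pain_scores, femg_amplitude)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")
```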


Subjects
Facial Muscles, Facial Pain, Electromyography, Facial Pain/diagnosis, Healthy Volunteers, Heart Rate, Humans
7.
IEEE Trans Vis Comput Graph ; 23(4): 1312-1321, 2017 04.
Article in English | MEDLINE | ID: mdl-28141522

ABSTRACT

In immersive Virtual Reality systems, users tend to move in a Virtual Environment as they would in an analogous physical environment. In this work, we investigated how user behaviour is affected when the Virtual Environment differs from the physical space. We created two sets of four environments each, plus a virtual replica of the physical environment as a baseline. The first set focused on aesthetic discrepancies, such as a water surface in place of solid ground. The second focused on mixing immaterial objects with objects paired to tangible props, for example, barring an area with walls or obstacles. We designed a study in which participants had to reach three waypoints laid out in such a way as to prompt a decision on which path to follow, based on the conflict between the mismatching visual stimuli and their awareness of the real layout of the room. We analysed their performance to determine whether their trajectories deviated significantly from the shortest route. Our results indicate that participants altered their trajectories in the presence of surfaces representing higher walking difficulty (for example, water instead of grass). However, when the graphical appearance was ambiguous, there was no significant trajectory alteration. The environments mixing immaterial and physical objects had the most impact on trajectories, with a mean deviation from the shortest route of 60 cm against 37 cm for environments with aesthetic alterations. The co-existence of paired and unpaired virtual objects was reported to support the idea that all objects participants saw were backed by physical props. From these results and our observations, we derive guidelines on how to alter user movement behaviour in Virtual Environments.
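The trajectory-deviation measure can be pictured as the mean distance of recorded positions from the straight line between waypoints. The sketch below implements that idea with made-up 2D sample points; the geometry helper and the data are assumptions for illustration only.

```python
# Illustrative sketch: mean deviation of a recorded 2D walking trajectory
# from the straight-line (shortest) route between two waypoints.
# Sample points are placeholders, not study data.
import numpy as np

def point_to_segment_distance(p: np.ndarray, a: np.ndarray, b: np.ndarray) -> float:
    """Distance from point p to the segment a-b (all 2D arrays)."""
    ab, ap = b - a, p - a
    t = np.clip(np.dot(ap, ab) / np.dot(ab, ab), 0.0, 1.0)
    return float(np.linalg.norm(p - (a + t * ab)))

def mean_deviation(trajectory: np.ndarray, start: np.ndarray, goal: np.ndarray) -> float:
    """Mean distance (metres) of trajectory samples from the direct route."""
    return float(np.mean([point_to_segment_distance(p, start, goal)
                          for p in trajectory]))

trajectory = np.array([[0.0, 0.0], [0.5, 0.2], [1.0, 0.5], [1.5, 0.3], [2.0, 0.0]])
start, goal = np.array([0.0, 0.0]), np.array([2.0, 0.0])
print(f"mean deviation: {mean_deviation(trajectory, start, goal):.2f} m")
```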
