Results 1 - 6 of 6
1.
iScience ; 26(11): 108099, 2023 Nov 17.
Article in English | MEDLINE | ID: mdl-37920667

ABSTRACT

Humans exhibit a strong tendency to synchronize movements with each other, and visual perspective may influence interpersonal synchronization. By manipulating the visual scenes of participants engaged in a joint finger-tapping task, we examined the effects of 1st-person and 2nd-person visual perspectives on their coordination dynamics. We hypothesized that perceiving the partner's movements from their 1st-person perspective would enhance spontaneous interpersonal synchronization, potentially mediated by embodiment of the partner's hand. We observed significant differences in attractor dynamics across visual perspectives. Specifically, participants in 1st-person coupling were unable to maintain decoupled trajectories as effectively as in 2nd-person coupling. Our findings suggest that visual perspective influences coordination dynamics in dyadic interactions, engaging error-correction mechanisms in individual brains as they integrate the partner's hand into their body representation. Our results have the potential to inform the development of applications for motor training and rehabilitation.
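Spontaneous synchronization in joint tapping tasks like this is commonly quantified via the relative phase between the two partners' tap times. The following is a minimal illustrative sketch (not the paper's analysis code), assuming tap-onset times have already been extracted for each partner:

```python
import numpy as np

def relative_phase(taps_a, taps_b):
    """Phase of each of B's taps within A's concurrent inter-tap interval,
    in radians (0 = in-phase, pi = anti-phase)."""
    taps_a = np.asarray(taps_a, dtype=float)
    phases = []
    for t in taps_b:
        # index of A's inter-tap interval containing tap time t
        i = np.searchsorted(taps_a, t, side="right") - 1
        if 0 <= i < len(taps_a) - 1:
            frac = (t - taps_a[i]) / (taps_a[i + 1] - taps_a[i])
            phases.append(2 * np.pi * frac)
    return np.array(phases)

# perfectly in-phase tapping at 2 Hz -> relative phase of 0 everywhere
a = np.arange(0.0, 10.0, 0.5)
print(relative_phase(a, a).max())  # prints 0.0
```

Shifting one partner's taps by half the tapping period would instead yield values near pi, the anti-phase mode between which the coupled dynamics typically switch.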

2.
Data Brief ; 51: 109663, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37869620

ABSTRACT

This dataset comprises motion capture, audio, and questionnaire data from violinists who underwent four augmented reality training sessions spanning a month. The motion capture data was recorded using a 42-marker Qualisys Animation marker set, capturing movement at 120 Hz. Audio data was captured using two condenser microphones at 24-bit depth and a 48 kHz sampling rate. The dataset encompasses recordings from 2 violin orchestra section leaders and 11 participants. Initially, we collected motion capture (MoCap) and audio data from the section leaders, who performed 2 distinct musical pieces. These recordings were then used to create 2 avatars, each representing a section leader and their respective musical piece. Subsequently, each avatar was assigned to a group of violinists, forming groups of 5 and 6 participants. Throughout the experiment, participants rehearsed one piece four times using a 2D representation of the avatar, and the other piece four times using a 3D representation. During the practice sessions, participants were instructed to replicate the avatar's bowing techniques as closely as possible, encompassing gestures related to bowing, articulation, and dynamics. For each trial, we collected motion capture data, audio data, and self-reported questionnaires from all participants. The questionnaires included the Witmer presence questionnaire, a subset of the Makransky presence questionnaire, the sense of musical agency questionnaire, and open-ended questions for participants to express their thoughts and experiences. Additionally, participants completed the Immersive Tendencies questionnaire and the Music Sophistication Index questionnaire, and provided demographic information before the first session commenced.

3.
Comput Human Behav ; 146: 107810, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37663430

ABSTRACT

The acquisition of advanced gestures is a challenge in various domains of proficient sensorimotor performance. For example, orchestral violinists must move in sync with the lead violinist's gestures. To help train these gestures, an educational music play-back system was developed using a simulated AR environment on the HoloLens 2 and an avatar representation of the lead violinist. This study aimed to investigate the impact of a 2D versus a 3D representation of the lead violinist's avatar on students' learning experience in the AR environment. To assess the learning outcome, the study employed a longitudinal design in which eleven participants practiced two pieces of music in four trials, evenly spaced over a month. Participants were asked to mimic the avatar's gestures as closely as possible in their use of the bow, including bowing, articulation, and dynamics. The study compared the similarity between the avatar's gestures and those of the participants at the biomechanical level, using motion capture measurements, as well as the smoothness of the participants' movements. Additionally, presence and perceived difficulty were assessed using questionnaires. The results suggest that a 3D representation of the avatar leads to better gesture resemblance and a stronger experience of presence than a 2D representation. The 2D representation, however, showed a learning effect that was not observed in the 3D condition. The findings suggest that the 3D condition benefits from stereoscopic information that enhances spatial cognition, making it more effective for sensorimotor performance. Overall, the 3D condition had a greater impact on performance than on learning. This work concludes with recommendations for future efforts directed at AR-based advanced gesture training, addressing challenges related to measurement methodology and participants' feedback on the AR application.
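Movement smoothness, mentioned above, is often operationalized in motor-control work as the log dimensionless jerk (LDLJ) of a velocity profile. The abstract does not specify which metric the study used, so the following is an illustrative sketch of that common measure, not the paper's own code:

```python
import numpy as np

def log_dimensionless_jerk(velocity, fs):
    """Log dimensionless jerk (LDLJ) of a 1-D velocity profile.
    Higher (less negative) values indicate smoother movement."""
    v = np.asarray(velocity, dtype=float)
    dt = 1.0 / fs
    jerk = np.gradient(np.gradient(v, dt), dt)  # second derivative of velocity
    duration = len(v) * dt
    # dimensionless jerk: normalize integrated squared jerk by duration and peak speed
    dlj = (duration ** 3 / np.abs(v).max() ** 2) * (jerk ** 2).sum() * dt
    return -np.log(dlj)

# a smooth bell-shaped stroke vs. the same stroke with tremor added
t = np.linspace(0.0, 1.0, 200)
smooth = np.sin(np.pi * t) ** 2
shaky = smooth + 0.05 * np.sin(40 * np.pi * t)
print(log_dimensionless_jerk(smooth, 200) > log_dimensionless_jerk(shaky, 200))  # True
```

Because the tremor term contributes large high-frequency jerk, the shaky profile scores markedly lower, which is the behavior a bowing-smoothness comparison would rely on.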

4.
Front Psychol ; 12: 663725, 2021.
Article in English | MEDLINE | ID: mdl-34177720

ABSTRACT

Virtual reality (VR) brings radical new possibilities to the empirical study of social music cognition and interaction. In the present article, we consider the role of VR as a research tool, based on its potential to create a sense of "social presence": the illusory feeling of being, and socially interacting, inside a virtual environment. This makes VR promising for bridging ecological validity ("research in the wild") and experimental control ("research in the lab") in empirical music research. A critical assumption, however, is the actual ability of VR to simulate real-life social interactions, either via human-embodied avatars or computer-controlled agents. The mediation of social musical interactions via VR is particularly challenging due to their embodied, complex, and emotionally delicate nature. In this article, we introduce a methodological framework to operationalize social presence by a combination of factors across interrelated layers, relating to the performance output, embodied co-regulation, and subjective experiences. This framework provides the basis for the proposal of a pragmatic approach to determine the level of social presence in virtual musical interactions, by comparing the outcomes across the multiple layers with the outcomes of corresponding real-life musical interactions. We applied and tested this pragmatic approach via a case study of piano duet performances of the piece Piano Phase composed by Steve Reich. This case study indicated that a piano duet performed in VR, in which the real-time interaction between pianists is mediated by embodied avatars, might lead to a strong feeling of social presence, as reflected in the measures of performance output, embodied co-regulation, and subjective experience. In contrast, although a piano duet in VR between an actual pianist and a computer-controlled agent led to a relatively successful performance output, it was inadequate in terms of both embodied co-regulation and subjective experience.

5.
Front Psychol ; 12: 647929, 2021.
Article in English | MEDLINE | ID: mdl-34108911

ABSTRACT

Musical life was disrupted in 2020 by the COVID-19 pandemic, and many musicians and venues turned to online alternatives such as livestreaming. In this study, three livestreamed concerts were organized to examine separate yet interconnected concepts (agency, presence, and social context) and to ascertain which components of livestreamed concerts facilitate social connectedness. Hierarchical Bayesian modeling was conducted on 83 complete responses to examine the effects of the manipulations on feelings of social connectedness with the artist and the audience. Results showed that in concert 1, where half of the participants were allowed to vote for the final song to be played, this option did not lead to the experience of more agency. Instead, if their preferred song was played (regardless of voting ability), participants experienced greater connectedness to the artist. In concert 2, participants who attended the concert with virtual reality headsets experienced greater feelings of physical presence, as well as greater feelings of connectedness with the artist, than those who viewed a normal YouTube livestream. In concert 3, attendance through Zoom led to a greater experience of social presence but predicted less connectedness with the artist compared to a normal YouTube livestream. Crucially, a greater negative impact of COVID-19 (e.g., loneliness) predicted feelings of connectedness with the artist, possibly because participants fulfilled their social needs through this parasocial interaction. Examining data from all concerts suggested that physical presence was a predictor of connectedness with both the artist and the audience, while social presence only predicted connectedness with the audience. Correlational analyses revealed that reductions in loneliness and isolation were associated with feelings of shared agency, physical and social presence, and connectedness to the audience. Overall, the findings suggest that in order to reduce feelings of loneliness and increase connectedness, concert organizers and musicians could tune elements of their livestreams to facilitate feelings of physical and social presence.

6.
Front Psychol ; 12: 623110, 2021.
Article in English | MEDLINE | ID: mdl-33912105

ABSTRACT

Since sound and music are powerful drivers of human behavior and physiology, we propose the use of sonification to activate healthy breathing patterns in participants and induce relaxation. Sonification is often used in the context of biofeedback, as it can provide an informational, non-invasive, real-time stimulus to monitor, motivate, or modify human behavior. The first goal of this study is the proposal and evaluation of a distance-based biofeedback system using a tempo- and phase-aligned sonification strategy to adapt breathing patterns and induce states of relaxation. A second goal is the evaluation of several sonification stimuli with 18 participants recruited online, whose psychometric and behavioral data we analyzed using questionnaires and measures of respiration rate and ratio, respectively. Sonification stimuli consisted of filtered noise mimicking a breathing sound, nature environmental sounds, and a musical phrase. Preliminary results indicated the nature stimulus as the most pleasant and as leading to the most prominent decrease in respiration rate. The noise sonification had the most beneficial effect on respiration ratio. While further research is needed to generalize these findings, this study and its methodological underpinnings suggest the potential of the proposed biofeedback system to support ecologically valid experiments at participants' homes during the COVID-19 pandemic.
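The two behavioral measures named above, respiration rate and respiration (inhale/exhale) ratio, can be derived from detected breath events. A minimal sketch, assuming inhale and exhale onsets have already been extracted as alternating timestamps (the function and variable names are illustrative, not the paper's):

```python
import numpy as np

def respiration_metrics(inhale_onsets, exhale_onsets):
    """Breaths per minute and mean inhale/exhale duration ratio.
    Assumes alternating events: inhale[i] < exhale[i] < inhale[i+1]."""
    inhale = np.asarray(inhale_onsets, dtype=float)
    exhale = np.asarray(exhale_onsets, dtype=float)
    breath_periods = np.diff(inhale)  # one full breathing cycle per inhale onset
    rate_bpm = 60.0 / breath_periods.mean()
    inhale_dur = exhale - inhale[: len(exhale)]            # inhale phase durations
    exhale_dur = inhale[1:] - exhale[: len(inhale) - 1]    # exhale phase durations
    ratio = (inhale_dur[: len(exhale_dur)] / exhale_dur).mean()
    return rate_bpm, ratio

# 12 breaths/min with a 2 s inhale and 3 s exhale per cycle
inhale = np.arange(0.0, 60.0, 5.0)
exhale = inhale + 2.0
rate, ratio = respiration_metrics(inhale, exhale)
print(rate, round(ratio, 3))  # prints: 12.0 0.667
```

A biofeedback loop of the kind proposed here would compare such metrics against the tempo of the sonification stimulus to decide how to adapt it.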
