Results 1 - 4 of 4
1.
Cogn Sci ; 45(6): e12989, 2021 06.
Article in English | MEDLINE | ID: mdl-34170013

ABSTRACT

Gestures and speech are clearly synchronized in many ways. However, previous studies have shown that the semantic similarity between gestures and speech breaks down as people approach transitions in understanding. Explanations for these gesture-speech mismatches, which focus on gestures and speech expressing different cognitive strategies, have been criticized for disregarding gestures' and speech's integration and synchronization. In the current study, we applied three different perspectives to investigate gesture-speech synchronization in an easy and a difficult task: temporal alignment, semantic similarity, and complexity matching. Participants engaged in a simple cognitive task and were assigned to either an easy or a difficult condition. We automatically measured pointing gestures, and we coded participants' speech, to determine the temporal alignment and semantic similarity between gestures and speech. Multifractal detrended fluctuation analysis was used to determine the extent of complexity matching between gestures and speech. We found that task difficulty indeed influenced gesture-speech synchronization in all three domains. We thereby extended the phenomenon of gesture-speech mismatches to difficult tasks in general. Furthermore, we investigated how temporal alignment, semantic similarity, and complexity matching were related in each condition, and how they predicted participants' task performance. Our study illustrates how combining multiple perspectives, originating from different research areas (i.e., coordination dynamics, complexity science, cognitive psychology), provides novel understanding of cognitive concepts in general and of gesture-speech synchronization and task difficulty in particular.
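The complexity-matching analysis above rests on multifractal detrended fluctuation analysis, which generalizes ordinary DFA across a range of fluctuation moments. As a rough illustration of the core idea only (not the authors' pipeline), a minimal monofractal DFA in Python might look like:

```python
import numpy as np

def dfa(signal, scales):
    """Detrended fluctuation analysis (monofractal sketch).

    Returns the scaling exponent alpha, estimated as the
    log-log slope of fluctuation F(s) against window size s.
    """
    x = np.asarray(signal, dtype=float)
    # Step 1: integrate the mean-centred series (the "profile").
    profile = np.cumsum(x - x.mean())
    flucts = []
    for s in scales:
        n_windows = len(profile) // s
        f2 = []
        for i in range(n_windows):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            # Step 2: remove a linear trend within each window.
            coeffs = np.polyfit(t, seg, 1)
            resid = seg - np.polyval(coeffs, t)
            f2.append(np.mean(resid ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    # Step 3: the slope of log F(s) vs. log s gives alpha.
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha

# White noise should yield alpha near 0.5.
rng = np.random.default_rng(0)
alpha_white = dfa(rng.standard_normal(4096), [16, 32, 64, 128, 256])
```

The multifractal extension additionally varies an averaging exponent q over the per-window fluctuations; complexity matching then compares the resulting scaling spectra of the gesture and speech series.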


Subjects
Gestures, Speech, Humans, Semantics
2.
Ann N Y Acad Sci ; 1491(1): 89-105, 2021 05.
Article in English | MEDLINE | ID: mdl-33336809

ABSTRACT

It is commonly understood that hand gesture and speech coordination in humans is culturally and cognitively acquired, rather than having a biological basis. Recently, however, the biomechanical physical coupling of arm movements to speech vocalization has been studied in steady-state vocalization and monosyllabic utterances, where forces produced during gesturing are transferred onto the tensioned body, leading to changes in respiratory-related activity and thereby affecting vocalization F0 and intensity. In the current experiment (n = 37), we extend this previous line of work to show that gesture-speech physics also affects fluent speech. Compared with no movement, participants producing fluent, self-formulated speech while rhythmically moving their limbs showed heightened F0 and amplitude envelope, and these effects were more pronounced for higher-impulse arm movements than for lower-impulse wrist movements. We replicate the finding that acoustic peaks arise especially around moments of peak impulse (i.e., the beat) of the movement, namely around its deceleration phases. Finally, higher deceleration rates of higher-mass arm movements were related to higher acoustic peaks. These results confirm a role for the physical impulses of gesture in affecting the speech system. We discuss the implications of gesture-speech physics for understanding the emergence of communicative gesture, both ontogenetically and phylogenetically.
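The amplitude envelope mentioned above can be approximated with standard signal processing. A minimal sketch, assuming a Hilbert-transform approach with moving-average smoothing (an illustration, not the authors' exact pipeline):

```python
import numpy as np
from scipy.signal import hilbert

def amplitude_envelope(wave, fs, cutoff_hz=20.0):
    """Amplitude envelope of a waveform: magnitude of the analytic
    signal, smoothed with a simple moving average."""
    env = np.abs(hilbert(wave))
    win = max(1, int(fs / cutoff_hz))
    kernel = np.ones(win) / win
    return np.convolve(env, kernel, mode="same")

# Synthetic stand-in for voiced speech: a 200 Hz carrier with a
# slow 3 Hz amplitude modulation the envelope should recover.
fs = 8000
t = np.arange(fs) / fs  # one second of samples
carrier = np.sin(2 * np.pi * 200 * t)
modulator = 0.5 + 0.5 * np.sin(2 * np.pi * 3 * t)
env = amplitude_envelope(carrier * modulator, fs)
```

F0 extraction is a harder problem (autocorrelation- or cepstrum-based pitch trackers are typical); dedicated tools such as Praat are commonly used for both measures in gesture-speech research.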


Subjects
Gestures, Movement/physiology, Speech/physiology, Adolescent, Biomechanical Phenomena, Female, Humans, Male, Motion Perception/physiology, Speech Acoustics, Young Adult
3.
Acta Psychol (Amst) ; 211: 103187, 2020 Nov.
Article in English | MEDLINE | ID: mdl-33075690

ABSTRACT

Children move their hands to explore, learn, and communicate about hands-on tasks. Their hand movements seem to be "learning" ahead of speech. Children shape their hand movements in accordance with spatial and temporal task properties, such as when they feel an object or simulate its movements. Their speech, however, does not directly correspond to these spatial and temporal task properties. We aimed to understand whether and how hand movements lead cognitive development through their ability to correspond to spatiotemporal task properties, while speech is unable to do so. We explored whether the variability of hand movements and speech changed with a change in spatiotemporal task properties, using two variability measures: Diversity indicates adaptation, while Complexity indicates flexibility to adapt. In two experiments, we asked children (4-7 years) to predict and explain balance scale problems, whereby we manipulated either the length of the balance scale or the mass of the weights after half of the trials. In three out of four conditions, we found a change in Complexity for both hand movements and speech between the first and second halves of the task. In one of these conditions, we found a relation between the differences in Complexity and Diversity of hand movements and speech. Changes in spatiotemporal task properties thus often influenced the flexibility of both hand movements and speech, but there seem to be differences in how they did so. We provide several directions for future research to further unravel the relations between hand movements, speech, task properties, variability, and cognitive development.
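Diversity and Complexity here are specific variability measures from the dynamic-systems literature. Purely as a hypothetical illustration of a diversity-style measure, one could compute the Shannon entropy of behaviour-code frequencies per task half (the code labels below are made up):

```python
from collections import Counter
from math import log

def diversity(codes):
    """Shannon entropy of the distribution of behaviour codes:
    higher values mean behaviour is spread over more distinct codes."""
    counts = Counter(codes)
    total = sum(counts.values())
    return -sum((c / total) * log(c / total) for c in counts.values())

# Hypothetical coded hand-movement streams for the two task halves.
first_half = ["point", "point", "trace", "point", "hold"]
second_half = ["point", "trace", "hold", "sweep", "trace", "sweep"]

# A positive change would indicate more diverse behaviour after the
# manipulation of the spatiotemporal task properties.
change = diversity(second_half) - diversity(first_half)
```

Complexity-style measures, by contrast, typically look at the temporal structure of the stream (e.g., entropy of transitions or fractal scaling) rather than the static code distribution.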


Subjects
Cognition, Hand, Movement, Speech, Child, Humans, Psychomotor Performance
4.
Front Psychol ; 7: 473, 2016.
Article in English | MEDLINE | ID: mdl-27065933

ABSTRACT

As children learn, they use speech to express words and their hands to gesture. This study investigates the interplay between real-time gestures and speech as children construct cognitive understanding during a hands-on science task. Twelve children (M = 6, F = 6) from kindergarten (n = 5) and first grade (n = 7) participated in this study. Each verbal utterance and gesture during the task was coded on a complexity scale derived from dynamic skill theory. To explore the interplay between speech and gestures, we applied cross recurrence quantification analysis (CRQA) to the two coupled time series of the skill levels of verbalizations and gestures. The analysis focused on (1) the temporal relation between gestures and speech, (2) the relative strength and direction of the interaction between gestures and speech, (3) the relative strength and direction between gestures and speech for different levels of understanding, and (4) relations between CRQA measures and other child characteristics. The results show that older and younger children differ in the (temporal) asymmetry of the gestures-speech interaction. For younger children, the balance leans more toward gestures leading speech in time, while for older children it leans more toward speech leading gestures. Second, at the group level, speech attracts gestures in a more dynamically stable fashion than vice versa, and this asymmetry between gestures and speech extends to lower and higher understanding levels. Yet, for older children, the mutual coupling between gestures and speech is more dynamically stable at the higher understanding levels. Gestures and speech are more synchronized in time as children get older. A higher score on schools' language tests is related to speech attracting gestures more rigidly and to more asymmetry between gestures and speech, but only for the less difficult understanding levels. A higher score on math or past science tasks is related to less asymmetry between gestures and speech. The picture that emerges from our analyses suggests that the relation between gestures, speech, and cognition is more complex than previously thought. We suggest that temporal differences and asymmetry in influence between gestures and speech arise from the simultaneous coordination of synergies.
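Full CRQA involves choices about embedding, radius, and line-based measures (determinism, laminarity, diagonal profiles for lead-lag). As a minimal sketch of just the starting point, the cross-recurrence matrix for two categorical codings (with made-up skill-level series, not the study's data):

```python
import numpy as np

def cross_recurrence(a, b):
    """Cross-recurrence matrix for two categorical series:
    cell (i, j) is 1 when a[i] matches b[j], else 0."""
    a = np.asarray(a)
    b = np.asarray(b)
    return (a[:, None] == b[None, :]).astype(int)

def recurrence_rate(rp):
    """Proportion of recurrent points in the matrix."""
    return rp.mean()

# Hypothetical skill-level codings of speech and gesture over time.
speech = [1, 1, 2, 2, 3, 3]
gesture = [1, 2, 2, 3, 3, 3]
rp = cross_recurrence(speech, gesture)
rr = recurrence_rate(rp)
```

The lead-lag asymmetry discussed in the abstract is typically read off the diagonal-wise recurrence profile: more recurrence above versus below the main diagonal indicates which stream tends to lead the other in time.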
