1.
PLoS One ; 17(12): e0278378, 2022.
Article in English | MEDLINE | ID: mdl-36542635

ABSTRACT

Early life environments afford infants a variety of learning opportunities, and caregivers play a fundamental role in shaping infants' early life experience. Variation in maternal attitudes and parenting practices is likely to be greater between than within cultures. However, there is limited cross-cultural work characterising how early life environments differ across populations. We examined the early life environment of infants from two cultural contexts where attitudes towards parenting and infant development were expected to differ: 53 mother-infant dyads in the UK and 44 mother-infant dyads in Uganda. Participants were studied longitudinally from when infants were 3 to 15 months old. Questionnaire data revealed that the Ugandan mothers had more relational attitudes towards parenting than the mothers from the UK, who had more autonomous parenting attitudes. Using questionnaires and observational methods, we examined whether infant development and experience aligned with maternal attitudes. We found that the Ugandan infants experienced a more relational upbringing than the UK infants, receiving more distributed caregiving, more body contact with their mothers, and more proximity to their mothers at night. Ugandan infants also showed earlier physical development than UK infants. Contrary to our expectations, however, Ugandan infants were not in closer proximity to their mothers during the day and did not have more people in proximity or more partners for social interaction than UK infants. In addition, when we examined attitudes towards specific behaviours, mothers' attitudes rarely predicted infant experience in related contexts. Taken together, our findings highlight the importance of measuring behaviour rather than extrapolating expected behaviour from attitudes alone. Infants' early life environments vary cross-culturally in many important ways, and future research should investigate the consequences of these differences for later development.


Subject(s)
Cross-Cultural Comparison , Mother-Child Relations , Female , Humans , Infant , Attitude , Mothers , Parenting , Surveys and Questionnaires
2.
Proc Natl Acad Sci U S A ; 119(47): e2206486119, 2022 11 22.
Article in English | MEDLINE | ID: mdl-36375066

ABSTRACT

Humans are argued to be unique in their ability and motivation to share attention with others about external entities: sharing attention for sharing's sake. Indeed, in humans, using referential gestures declaratively to direct the attention of others toward external objects and events emerges in the first year of life. In contrast, wild great apes seldom use referential gestures, and when they do, it seems to be exclusively for imperative purposes. This apparent species difference has fueled the argument that the motivation and ability to share attention with others is a human-specific trait with important downstream consequences for the evolution of our complex cognition [M. Tomasello, Becoming Human (2019)]. Here, we report evidence of a wild ape showing a conspecific an item of interest. We provide video evidence of an adult female chimpanzee, Fiona, showing a leaf to her mother, Sutherland, in the context of leaf grooming in Kibale Forest, Uganda. We use a dataset of 84 similar leaf-grooming events to explore alternative explanations for the behavior, including food sharing and initiating dyadic grooming or playing. Our observations suggest that in highly specific social conditions, wild chimpanzees, like humans, may use referential showing gestures to direct others' attention to objects simply for the sake of sharing. The difference between humans and our closest living relatives in this regard may be quantitative rather than qualitative, with ramifications for our understanding of the evolution of human social cognition.


Subject(s)
Hominidae , Pan troglodytes , Female , Humans , Animals , Gestures , Animal Communication , Mothers
3.
Anim Cogn ; 25(6): 1393-1398, 2022 Dec.
Article in English | MEDLINE | ID: mdl-35595881

ABSTRACT

The human auditory system is capable of processing human speech even when it has been heavily degraded, such as by noise-vocoding, which strongly reduces frequency-domain cues to phonetic content. This has contributed to arguments that speech processing is highly specialized and likely a de novo evolved trait in humans. Previous comparative research has demonstrated that a language-competent chimpanzee was also capable of recognizing degraded speech, and therefore that the mechanisms underlying speech processing may not be uniquely human. However, to form a robust reconstruction of the evolutionary origins of speech processing, additional data from other closely related ape species are needed. Specifically, such data can help disentangle whether these capabilities evolved independently in humans and chimpanzees, or were inherited from our last common ancestor. Here we provide evidence of processing of highly varied (degraded and computer-generated) speech in a language-competent bonobo, Kanzi. We took advantage of Kanzi's existing proficiency with touchscreens and his ability to report his understanding of human speech by interacting with arbitrary symbols called lexigrams. Specifically, we asked Kanzi to recognise both human (natural) and computer-generated forms of 40 highly familiar words that had been degraded (noise-vocoded and sinusoidal forms), using a match-to-sample paradigm. Results suggest that, apart from noise-vocoded computer-generated speech, Kanzi recognised both natural and computer-generated voices that had been degraded, at rates significantly above chance. Kanzi performed better with all forms of natural-voice speech than with computer-generated speech. This work provides additional support for the hypothesis that the processing apparatus necessary to deal with highly variable speech, including, for the first time in nonhuman animals, computer-generated speech, may be at least as old as the last common ancestor we share with bonobos and chimpanzees.


Subject(s)
Hominidae , Pan paniscus , Speech Perception , Animals , Humans , Acoustic Stimulation/veterinary , Computers , Pan troglodytes , Speech
4.
PLoS One ; 16(7): e0255241, 2021.
Article in English | MEDLINE | ID: mdl-34297777

ABSTRACT

Joint attention, or sharing attention with another individual about an object or event, is a critical behaviour that emerges in pre-linguistic infants and predicts later language abilities. Given its importance, it is perhaps surprising that there is no consensus on how to measure joint attention in pre-linguistic infants. A rigorous definition proposed by Siposova & Carpenter (2019) requires the infant and partner to alternate their gaze between an object and each other (coordination of attention) and to exchange communicative signals (explicit acknowledgement of jointly sharing attention). However, Hobson and Hobson (2007) proposed that the quality of gaze between individuals is, in itself, a sufficient communicative signal that demonstrates sharing of attention. They proposed that observers can reliably distinguish "sharing", "checking", and "orienting" looks, but the empirical basis for this claim is limited, as their study focussed on two raters examining looks from 11-year-old children. Here, we analysed categorisations made by 32 naïve raters of 60 infant looks to their mothers, to examine whether the looks could be reliably distinguished according to Hobson and Hobson's definitions. Raters had overall low agreement, and only in 3 out of 26 cases did a significant majority of the raters agree with the judgement of the mother who had received the look. For the looks that raters did agree on at above-chance levels, look duration and the overall communication rate of the mother were identified as cues that raters may have relied upon. In our experiment, naïve third-party observers could not reliably determine the type of look infants gave to their mothers, which indicates that subjective judgements of look type should not be used to identify mutual awareness of sharing attention in infants. Instead, we advocate the use of objective behavioural measurement to infer that interactants know they are 'jointly' attending to an object or event, and believe this will be a crucial step in understanding the ontogenetic and evolutionary origins of joint attention.


Subject(s)
Attention , Eye Movements , Infant Behavior , Mother-Child Relations , Psychological Tests/standards , Adult , Child , Child Development , Female , Humans , Infant , Male , Middle Aged , Observer Variation
5.
Philos Trans R Soc Lond B Biol Sci ; 375(1789): 20180403, 2020 01 06.
Article in English | MEDLINE | ID: mdl-31735155

ABSTRACT

Although important similarities have been found between human and animal communication systems, surprisingly little research effort has focussed on whether the cognitive mechanisms underpinning these behaviours are also similar. In particular, it is highly debated whether signal production is the result of reflexive processes or can be characterized as intentional. Here, we critically evaluate the criteria used to identify signals produced with different degrees of intentionality, and discuss recent attempts to apply these criteria to the vocal, gestural, and multimodal communicative signals of great apes and more distantly related species. Finally, we outline the research tools, such as physiologically validated measures of arousal, and the empirical evidence that we believe would propel this debate forward and help unravel the evolutionary origins of human intentional communication. This article is part of the theme issue 'What can animal communication teach us about human language?'


Subject(s)
Animal Communication , Gestures , Hominidae/physiology , Animals , Behavior, Animal , Biological Evolution , Humans , Language , Primates