Results 1 - 8 of 8
1.
Cogn Sci ; 46(10): e13203, 2022 Oct.
Article in English | MEDLINE | ID: mdl-36251421

ABSTRACT

Of the six possible orderings of the three main constituents of language (subject, verb, and object), two, SOV and SVO, are predominant cross-linguistically. Previous research using the silent gesture paradigm, in which hearing participants produce or respond to gestures without speech, has shown that factors such as reversibility, salience, and animacy can affect the preferences for different orders. Here, we test whether participants' preferences for orders that are conditioned on the semantics of the event change depending on (i) the iconicity of individual gestural elements and (ii) prior knowledge of a conventional lexicon. Our findings demonstrate the same preference for semantically conditioned word order found in previous studies, specifically that SOV and SVO are preferred differentially for different types of events. We do not find that the iconicity of individual gestures affects participants' ordering preferences; however, we do find that learning a lexicon leads to a stronger preference for SVO-like orders overall. Finally, we compare our findings from English speakers, whose language is SVO-dominant, with data from speakers of Turkish, an SOV-dominant language. We find that, while learning a lexicon leads to an increase in SVO preference for both sets of participants, this effect is mediated by language background and event type, suggesting that an interplay of factors together determines preferences for different ordering patterns. Taken together, our results support a view of word order as a gradient phenomenon responding to multiple biases.


Subject(s)
Gestures , Language , Humans , Learning , Semantics , Speech
2.
Cognition ; 228: 105206, 2022 Nov.
Article in English | MEDLINE | ID: mdl-35810511

ABSTRACT

Silent gesture studies, in which hearing participants from different linguistic backgrounds produce gestures to communicate events, have been used to test hypotheses about the cognitive biases that govern cross-linguistic word order preferences. In particular, the differential use of SOV and SVO order to communicate, respectively, extensional events (where the direct object exists independently of the event; e.g., girl throws ball) and intensional events (where the meaning of the direct object is potentially dependent on the verb; e.g., girl thinks of ball), has been suggested to represent a natural preference, demonstrated in improvisation contexts. However, natural languages tend to prefer systematic word orders, where a single order is used regardless of the event being communicated. We present a series of studies that investigate ordering preferences for SOV and SVO orders using an online forced-choice experiment, where English-speaking participants select orders for different events (i) in the absence of conventions and (ii) after learning event-order mappings at different frequencies in a regularisation experiment. Our results show that natural ordering preferences arise in the absence of conventions, replicating previous findings from production experiments. In addition, we show that participants regularise the input they learn in the manual modality in two ways, such that, while the preference for systematic order patterns increases through learning, it exists in competition with the natural ordering preference, which conditions order on the semantics of the event. Using our experimental data in a computational model of cultural transmission, we show that this pattern is expected to persist over generations, suggesting that we should expect to see evidence of semantically conditioned word order variability in at least some languages.
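The persistence claim in this abstract can be illustrated with a toy iterated-learning sketch. Everything below is an illustrative assumption, not the paper's fitted model: the parameter values, the function names, and the update rule (blend the observed SVO proportion with a semantic prior, then regularise towards 0 or 1) are all hypothetical. The point it demonstrates is the qualitative one made above: a regularising learner makes each event type's order more consistent, yet the semantic conditioning (different orders for different event types) survives across generations rather than collapsing into one uniform order.

```python
# Toy iterated-learning sketch; all parameters are hypothetical.
# p is always P(SVO) for one event type.

def regularise(p, alpha=1.5):
    # alpha > 1 pushes probabilities towards 0 or 1, modelling a
    # regularising learner; alpha = 1 would reproduce the input faithfully.
    return p ** alpha / (p ** alpha + (1 - p) ** alpha)

def learn(p_observed, prior, weight=0.2, alpha=1.5):
    # Blend the observed SVO proportion with a semantic ordering prior,
    # then regularise the result.
    return regularise((1 - weight) * p_observed + weight * prior, alpha)

def transmit(p_svo, priors, generations=50):
    # Each generation learns from the previous generation's output.
    for _ in range(generations):
        p_svo = {event: learn(p, priors[event]) for event, p in p_svo.items()}
    return p_svo

# Assumed priors: extensional events lean SOV, intensional events lean SVO.
priors = {"extensional": 0.3, "intensional": 0.7}
final = transmit({"extensional": 0.5, "intensional": 0.5}, priors)
# Each event type ends up near-categorical (P(SVO) close to 0 for
# extensional, close to 1 for intensional), so conditioning persists.
```

Because the semantic prior is re-applied at every generation, regularisation drives each event type towards a different near-categorical order instead of a single global one, which is the dynamic the transmission model above is said to predict.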


Subject(s)
Language , Linguistics , Female , Gestures , Humans , Language Development , Learning , Semantics
3.
Front Psychol ; 13: 805144, 2022.
Article in English | MEDLINE | ID: mdl-35529568

ABSTRACT

How do cognitive biases and mechanisms from learning and use interact when a system of language conventions emerges? We investigate this question by focusing on how transitive events are conveyed in silent gesture production and interaction. Silent gesture experiments (in which participants improvise, using gesture but no speech) have been used to investigate cognitive biases that shape utterances produced in the absence of a conventional language system. In this mode of communication, participants do not follow the dominant order of their native language (e.g., Subject-Verb-Object), and instead condition the structure on the semantic properties of the events they are conveying. An important source of variability in structure in silent gesture is the property of reversibility. Reversible events typically have two animate participants whose roles can be reversed (girl kicks boy). Without a syntactic/conventional means of conveying who does what to whom, there is inherent ambiguity about the agent and patient roles in the event (by contrast, this is less pressing for non-reversible events like girl kicks ball). In Experiment 1, we test a novel, fine-grained analysis of reversibility. In a silent gesture production experiment, we show that the variability in word order depends on two factors (properties of the verb and properties of the direct object) that together determine how reversible an event is. We relate our experimental results to principles from information theory, showing that our data support the "noisy channel" account of constituent order. In Experiment 2, we focus on the influence of interaction on word order variability for reversible and non-reversible events. We show that when participants use silent gesture for communicative interaction, they become more consistent in their usage of word order over time; however, this pattern is less pronounced for events that are classified as strongly non-reversible.
We conclude that full consistency in word order is theoretically a good strategy, but word order use in practice is a more complex phenomenon.

4.
Cogn Sci ; 43(7): e12732, 2019 Jul.
Article in English | MEDLINE | ID: mdl-31310026

ABSTRACT

Natural languages make prolific use of conventional constituent-ordering patterns to indicate "who did what to whom," yet the mechanisms through which these regularities arise are not well understood. A series of recent experiments demonstrates that, when prompted to express meanings through silent gesture, people bypass native language conventions, revealing apparent biases underpinning word order usage, based on the semantic properties of the information to be conveyed. We extend the scope of these studies by focusing, experimentally and computationally, on the interpretation of silent gesture. We present cross-linguistic experimental evidence that people use variability in constituent order as a cue to obtain different interpretations. To illuminate the computational principles that govern the interpretation of non-conventional communication, we derive a Bayesian model of interpretation via biased inductive inference and estimate these biases from the experimental data. Our analyses suggest that people's interpretations balance the ambiguity characteristic of emerging language systems with ordering preferences that are skewed and asymmetric, but defeasible.
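The Bayesian interpretation step described in this abstract can be sketched with a toy Bayes-rule computation. The numbers, the meaning labels, and the two-meaning setup below are illustrative assumptions, not the paper's fitted biases: an observer sees one gesture string and infers which referent is the agent by combining a prior over meanings with the likelihood of that string under each meaning.

```python
# Minimal Bayes-rule sketch of order interpretation; all numbers are
# hypothetical, not the fitted bias values from the paper.

def posterior(likelihood, prior):
    # P(meaning | order) is proportional to P(order | meaning) * P(meaning),
    # normalised over the candidate meanings.
    joint = {m: likelihood[m] * prior[m] for m in prior}
    total = sum(joint.values())
    return {m: p / total for m, p in joint.items()}

# Observed gesture string: GIRL BOY KICK. An assumed agent-first production
# bias makes this string far more likely if the girl is the agent.
prior = {"girl_agent": 0.5, "boy_agent": 0.5}       # flat prior over meanings
likelihood = {"girl_agent": 0.9, "boy_agent": 0.1}  # P(string | meaning)
post = posterior(likelihood, prior)                 # favours "girl_agent"
```

The skew in the likelihoods encodes an ordering bias, and because inference is probabilistic rather than rule-based, the bias stays defeasible: sufficiently strong contrary evidence in the prior or the string can still flip the preferred interpretation.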


Subject(s)
Cognition , Gestures , Language , Bayes Theorem , Bias , Female , Humans , Male , Semantics
5.
Cognition ; 192: 103964, 2019 Nov.
Article in English | MEDLINE | ID: mdl-31302362

ABSTRACT

Recent work on emerging sign languages provides evidence for how key properties of linguistic systems are created. Here we use laboratory experiments to investigate the contribution of two specific mechanisms-interaction and transmission-to the emergence of a manual communication system in silent gesturers. We show that the combined effects of these mechanisms, rather than either alone, maintain communicative efficiency, and lead to a gradual increase of regularity and systematic structure. The gestures initially produced by participants are unsystematic and resemble pantomime, but come to develop key language-like properties similar to those documented in newly emerging sign systems.


Subject(s)
Gestures , Linguistics , Sign Language , Adolescent , Adult , Humans , Young Adult
6.
Behav Brain Sci ; 40: e65, 2017 Jan.
Article in English | MEDLINE | ID: mdl-29342521

ABSTRACT

Understanding the relationship between gesture, sign, and speech offers a valuable tool for investigating how language emerges from a nonlinguistic state. We propose that the focus on linguistic status is problematic, and a shift to focus on the processes that shape these systems serves to explain the relationship between them and contributes to the central question of how language evolves.


Subject(s)
Language Development , Sign Language , Gestures , Humans , Language , Speech
7.
Cogn Sci ; 41 Suppl 4: 928-940, 2017 Apr.
Article in English | MEDLINE | ID: mdl-27859550

ABSTRACT

Many human languages have complex grammatical machinery devoted to temporality, but very little is known about how this came about. This paper investigates how people convey temporal information when they cannot use any conventional languages they know. In a laboratory experiment, adult participants were asked to convey information about simple events taking place at a given time, in spoken language and in silent gesture (i.e., using only gesture and no speech). In spoken language, participants formed utterances according to the rules of their native language (Dutch), but in silent gesture, the temporal information was presented first and kept structurally separate from the other information in the utterance. The experimental results are consistent with findings from natural systems emerging in situations of communicative stress: unsupervised adult second language learning and homesign. This confirms that presenting temporal information separately and initially (directly mirroring how temporal and propositional information can be represented semantically) is a robust strategy for talking about past and future when only sparse communicative means are available.


Subject(s)
Gestures , Language , Vocabulary , Adolescent , Adult , Communication , Female , Humans , Male , Young Adult
8.
Cognition ; 131(3): 431-6, 2014 Jun.
Article in English | MEDLINE | ID: mdl-24704967

ABSTRACT

Where do the different sentence orders in the languages of the world come from? Recently, it has been suggested that there is a basic sentence order, SOV (Subject-Object-Verb), which was the starting point for other sentence orders. Support for this claim was found in newly emerging languages, as well as in experiments where people are asked to convey simple meanings in improvised gesture production. In both cases, researchers found that the predominant word order is SOV. Recent literature has shown that the pragmatic rule 'Agent first' drives the preference for S-initial word order, but this rule does not decide between SOV and SVO. This paper presents experimental evidence for grounding the word order that emerges in gesture production in the semantic properties of the message to be conveyed. We focus on the role of the verb, and argue that the preference for SOV word order reported in earlier experiments is due to the use of extensional verbs (e.g., throw). With intensional verbs like think, the object is dependent on the agent's thought, and our experiment confirms that such verbs lead to a preference for SVO instead. We conclude that the meaning of the verb plays a crucial role in the sequencing of utterances in emerging language systems. This finding is relevant for the debate on language evolution, because it suggests that semantics underlies the early formation of syntactic rules.


Subject(s)
Language , Psycholinguistics , Semantics , Female , Gestures , Humans , Male , Young Adult