Results 1 - 10 of 10
1.
Cognition ; 249: 105811, 2024 May 21.
Article in English | MEDLINE | ID: mdl-38776621

ABSTRACT

Adults with no knowledge of sign languages can perceive distinctive markers that signal event boundedness (telicity), suggesting that telicity is a cognitively natural semantic feature that can be marked iconically (Strickland et al., 2015). This study asks whether non-signing children (5-year-olds) can also link telicity to iconic markers in sign. Experiment 1 attempted three close replications of Strickland et al. (2015) and found only limited success. However, Experiment 2 showed that children can both perceive the relevant visual feature and succeed at linking that visual property to telicity semantics when allowed to filter their answers through their own linguistic choices. Children's performance demonstrates the cognitive naturalness and early availability of the semantics of telicity, supporting the idea that telicity helps guide the language acquisition process.

2.
Science ; 383(6682): 519-523, 2024 Feb 02.
Article in English | MEDLINE | ID: mdl-38301028

ABSTRACT

Sign languages are naturally occurring languages. As such, their emergence and spread reflect the histories of their communities. However, limitations in historical recordkeeping and linguistic documentation have hindered the diachronic analysis of sign languages. In this work, we used computational phylogenetic methods to study family structure among 19 sign languages from deaf communities worldwide. We used phonologically coded lexical data from contemporary languages to infer relatedness and suggest that these methods can help study regular form changes in sign languages. The inferred trees are consistent in key respects with known historical information but challenge certain assumed groupings and surpass analyses made available by traditional methods. Moreover, the phylogenetic inferences are not reducible to geographic distribution but do affirm the importance of geopolitical forces in the histories of human languages.


Subject(s)
Language , Linguistics , Sign Language , Humans , Language/history , Linguistics/classification , Linguistics/history , Phylogeny
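
The phylogenetic analysis summarized in item 2 infers family structure from phonologically coded lexical data. As a rough, hypothetical illustration of the general idea only (distance-based clustering of binary feature codings, not the computational phylogenetic pipeline used in the paper; the language names and feature matrix are invented), a sketch in Python:

# Illustrative sketch only: cluster hypothetical sign languages by the
# proportion of coded phonological features on which they differ.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, dendrogram

languages = ["SignLangA", "SignLangB", "SignLangC", "SignLangD"]  # placeholders

# Rows: languages; columns: presence/absence of a coded phonological feature
# (e.g., a handshape or movement parameter) for a fixed concept list.
codings = np.array([
    [1, 0, 1, 1, 0, 1],
    [1, 0, 1, 0, 0, 1],
    [0, 1, 0, 1, 1, 0],
    [0, 1, 1, 1, 1, 0],
])

# Pairwise Hamming distances between languages.
distances = pdist(codings, metric="hamming")

# Average-linkage hierarchical clustering as a crude stand-in for a family tree.
tree = linkage(distances, method="average")
dendrogram(tree, labels=languages, no_plot=True)  # set no_plot=False to draw it
print(tree)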
3.
Cognition ; 210: 104596, 2021 05.
Article in English | MEDLINE | ID: mdl-33667973

ABSTRACT

The idea that the form of a word reflects information about its meaning has its roots in Platonic philosophy, and has been experimentally investigated for concrete, sensory-based properties since the early 20th century. Here, we provide evidence for an abstract property of 'boundedness' that introduces a systematic, iconic bias on the phonological expectations of a novel lexicon. We show that this abstract property is general across events and objects. In Experiment 1, we show that subjects are systematically more likely to associate sign language signs that end with a gestural boundary with telic verbs (denoting events with temporal boundaries, e.g., die, arrive) and with count nouns (denoting objects with spatial boundaries, e.g., ball, coin). In Experiments 2-3, we show that this iconic mapping acts on conceptual representations, not on grammatical features. Specifically, the mapping does not carry over to psychological nouns (e.g. people are not more likely to associate a gestural boundary with idea than with knowledge). Although these psychological nouns are still syntactically encoded as either count or mass, they do not denote objects that are conceived of as having spatial boundaries. The mapping bias thus breaks down. Experiments 4-5 replicate these findings with a new set of stimuli. Finally, in Experiments 6-11, we explore possible extensions to a similar bias for spoken language stimuli, with mixed results. Generally, the results here suggest that 'boundedness' of words' referents (in space or time) has a powerful effect on intuitions regarding the form that the words should take.


Subject(s)
Language , Linguistics , Bias , Gestures , Humans , Semantics , Sign Language
4.
Q J Exp Psychol (Hove) ; 71(11): 2325-2333, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30362405

ABSTRACT

In this study, we investigated whether auditory deprivation leads to a more balanced bilateral control of spatial attention in haptic space. We tested four groups of participants: early deaf, early blind, deafblind, and control (normally hearing and sighted) participants. Using a haptic line bisection task, we found that while normally hearing individuals (even when blind) showed a significant tendency to bisect to the left of the veridical midpoint (i.e., pseudoneglect), deaf individuals did not show any significant directional bias. This was the case for both deaf signers and non-signers, in line with prior findings obtained using a visual line bisection task. Interestingly, deafblind individuals also erred significantly to the left, resembling the pattern of early blind and control participants. Overall, these data suggest that deafness induces changes in the hemispheric asymmetry underlying the orienting of spatial attention in the haptic modality as well. Moreover, our findings indicate that what counterbalances the right-hemisphere dominance in the control of spatial attention is not the lack of auditory input per se, nor sign language use, but rather the heavier reliance on visual experience induced by early auditory deprivation.


Subject(s)
Bias , Blindness/physiopathology , Deaf-Blind Disorders/physiopathology , Deafness/physiopathology , Functional Laterality/physiology , Space Perception/physiology , Touch Perception/physiology , Adolescent , Adult , Aged , Analysis of Variance , Female , Humans , Male , Middle Aged , Young Adult
5.
Behav Brain Sci ; 40: e72, 2017 01.
Article in English | MEDLINE | ID: mdl-29342537

ABSTRACT

Goldin-Meadow & Brentari (G-M&B) argue that, for sign language users, gesture - in contrast to linguistic sign - is iconic, highly variable, and similar to spoken language co-speech gesture. We discuss two examples (telicity and absolute gradable adjectives) that challenge the use of these criteria for distinguishing sign from gesture.


Subject(s)
Gestures , Sign Language , Humans , Language , Language Development , Speech
6.
Proc Natl Acad Sci U S A ; 112(19): 5968-73, 2015 May 12.
Article in English | MEDLINE | ID: mdl-25918419

ABSTRACT

According to a theoretical tradition dating back to Aristotle, verbs can be classified into two broad categories. Telic verbs (e.g., "decide," "sell," "die") encode a logical endpoint, whereas atelic verbs (e.g., "think," "negotiate," "run") do not, and the denoted event could therefore logically continue indefinitely. Here we show that sign languages encode telicity in a seemingly universal way and moreover that even nonsigners lacking any prior experience with sign language understand these encodings. In experiments 1-5, nonsigning English speakers accurately distinguished between telic (e.g., "decide") and atelic (e.g., "think") signs from (the historically unrelated) Italian Sign Language, Sign Language of the Netherlands, and Turkish Sign Language. These results were not due to participants' inferring that the sign merely imitated the action in question. In experiment 6, we used pseudosigns to show that the presence of a salient visual boundary at the end of a gesture was sufficient to elicit telic interpretations, whereas repeated movement without salient boundaries elicited atelic interpretations. Experiments 7-10 confirmed that these visual cues were used by all of the sign languages studied here. Together, these results suggest that signers and nonsigners share universally accessible notions of telicity as well as universally accessible "mapping biases" between telicity and visual form.


Subject(s)
Linguistics , Sign Language , Communication , Comprehension , Cues , Gestures , Humans , Internet , Italy , Language , Linear Models , United States
7.
J Cogn Neurosci ; 24(2): 276-86, 2012 Feb.
Article in English | MEDLINE | ID: mdl-21916563

ABSTRACT

Confronted with the loss of one type of sensory input, we compensate using information conveyed by other senses. However, losing one type of sensory information at specific developmental times may lead to deficits across all sensory modalities. We addressed the effect of auditory deprivation on the development of tactile abilities, taking into account changes occurring at the behavioral and cortical level. Congenitally deaf and hearing individuals performed two tactile tasks, the first requiring the discrimination of the temporal duration of touches and the second requiring the discrimination of their spatial length. Compared with hearing individuals, deaf individuals were impaired only in tactile temporal processing. To explore the neural substrate of this difference, we ran a TMS experiment. In deaf individuals, the auditory association cortex was involved in temporal and spatial tactile processing, with the same chronometry as the primary somatosensory cortex. In hearing participants, the involvement of auditory association cortex occurred at a later stage and selectively for temporal discrimination. The different chronometry in the recruitment of the auditory cortex in deaf individuals correlated with the tactile temporal impairment. Thus, early hearing experience seems to be crucial to develop an efficient temporal processing across modalities, suggesting that plasticity does not necessarily result in behavioral compensation.


Subject(s)
Brain/physiopathology , Deafness/physiopathology , Time Perception/physiology , Touch Perception/physiology , Touch/physiology , Adult , Discrimination, Psychological/physiology , Female , Humans , Male , Middle Aged , Persons With Hearing Impairments , Transcranial Magnetic Stimulation
8.
Funct Neurol ; 27(3): 177-85, 2012.
Article in English | MEDLINE | ID: mdl-23402679

ABSTRACT

The present study investigates basic numerical processing in deaf signers and hearing individuals by evaluating notational effects (Arabic digits vs Italian Sign Language number signs) and response modality (manual vs pedal) in a parity judgment task. Overall, a standard SNARC effect emerged in both groups, suggesting similar numerical representation in hearing and deaf individuals. With the exception of Italian Sign Language stimuli in the hearing group, this effect applied to all stimulus notations and to both response modalities. In line with the special status of signs, the visuospatial complexity of finger configurations (i.e., number of extended fingers) affected the performance of the hearing group to a greater extent. Finally, the SNARC effect emerged systematically across lateralized effectors (manual/pedal responses), challenging the hypothesis that the stimulus-response compatibility effect is specific to the effectors associated with the production of written and sign language. As for parity processing, both groups were similarly influenced by the parity information conveyed by the dominant hand, indicating the compositional nature of number signs irrespective of the preferred language modality.


Subject(s)
Deafness , Hand , Hearing , Mathematics , Sign Language , Adolescent , Adult , Analysis of Variance , Female , Foot/physiology , Hand/physiology , Humans , Italy , Judgment/physiology , Male , Space Perception , Symbolism , Visual Perception , Young Adult
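
The SNARC effect reported in item 8 is commonly quantified by regressing, for each participant, the right-side minus left-side response-time difference on number magnitude; a negative slope indicates the usual small-left/large-right association. The following Python sketch illustrates that generic computation on invented response times; it is not the authors' analysis (the study used analyses of variance).

import numpy as np

# Hypothetical mean response times (ms) per digit, by responding side.
digits = np.array([1, 2, 3, 4, 6, 7, 8, 9])
rt_left = np.array([490, 492, 495, 500, 508, 512, 516, 520])   # left responses slow for large digits
rt_right = np.array([525, 520, 514, 508, 498, 494, 490, 486])  # right responses fast for large digits

# dRT = right minus left; under a SNARC effect dRT decreases with magnitude.
drt = rt_right - rt_left
slope, intercept = np.polyfit(digits, drt, 1)
print(f"SNARC slope: {slope:.1f} ms per unit of magnitude (negative = SNARC-consistent)")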
9.
J Deaf Stud Deaf Educ ; 16(1): 101-7, 2011.
Article in English | MEDLINE | ID: mdl-20679138

ABSTRACT

Although signed and speech-based languages have a similar internal organization of verbal short-term memory, sign span is lower than word span. We investigated whether this is due to the fact that signs are not suited for serial recall, as proposed by Bavelier, Newport, Hall, Supalla, and Boutla (2008. Ordered short-term memory differs in signers and speakers: Implications for models of short-term memory. Cognition, 107, 433-459). We administered a serial recall task with stimuli in Italian Sign Language to 12 deaf people, and we compared their performance with that of twelve age-, gender-, and education-matched hearing participants who performed the task in Italian. The results do not offer evidence for the hypothesis that serial order per se is a detrimental factor for deaf participants. An alternative explanation for the lower sign span based on signs being phonologically heavier than words is considered.


Subject(s)
Deafness/psychology , Memory, Short-Term , Sign Language , Adult , Cognition , Female , Humans , Italy , Language , Male , Mental Recall , Middle Aged , Serial Learning
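
Items 9 and 10 both rest on span measures. Scoring conventions differ across studies, but a span task is typically scored as the longest list length a participant can reproduce in correct serial order. The Python function below is a generic, hypothetical illustration of such scoring, not the procedure used in either study.

def span_score(trials, criterion=0.5):
    """Longest list length recalled in correct serial order.

    trials: dict mapping list length -> list of booleans, one per trial,
            True if the whole list was reproduced in the right order.
    criterion: minimum proportion of correct trials required at a length.
    """
    span = 0
    for length in sorted(trials):
        outcomes = trials[length]
        if outcomes and sum(outcomes) / len(outcomes) >= criterion:
            span = length
        else:
            break
    return span

# Hypothetical data: two trials per list length.
example = {2: [True, True], 3: [True, True], 4: [True, False], 5: [False, False]}
print(span_score(example))  # prints 4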
10.
Cognition ; 106(2): 780-804, 2008 Feb.
Article in English | MEDLINE | ID: mdl-17537417

ABSTRACT

It is known that memory span is shorter in American Sign Language (ASL) than in English, but this discrepancy has never been systematically investigated using other pairs of signed and spoken languages. This finding is at odds with results showing that short-term memory (STM) for signs has an internal organization similar to STM for words. Moreover, some methodological questions remain open. Thus, we measured the span of deaf and matched hearing participants for Italian Sign Language (LIS) and Italian, respectively, controlling for all the variables that might be responsible for the discrepancy; yet a difference in span between deaf signers and hearing speakers was still found. However, the advantage of hearing subjects disappeared in a visuo-spatial STM task. We attribute the source of the lower span to the internal structure of signs: unlike English (or Italian) words, signs contain both simultaneous and sequential components. Nonetheless, sign languages are fully-fledged grammatical systems, probably because the overall architecture of the grammar of signed languages reduces the STM load. Our hypothesis is that the faculty of language depends on STM, but is flexible enough to develop even in a relatively hostile environment.


Subject(s)
Language , Memory, Short-Term/physiology , Sign Language , Adult , Deafness/psychology , Female , Humans , Male , Middle Aged , Psycholinguistics