Results 1 - 14 of 14
1.
Front Psychol; 12: 658189, 2021.
Article in English | MEDLINE | ID: mdl-34867572

ABSTRACT

The present study examined differences between inflectional and derivational morphology using Greek nouns and verbs with masked priming (with both short and long stimulus onset asynchrony) and long-lag priming. A lexical decision task to inflected noun and verb targets was used to test whether their processing is differentially facilitated by prior presentation of their stem in words of the same grammatical class (inflectional morphology) or of a different grammatical class (derivational morphology). Differences in semantics, syntactic information, and morphological complexity between inflected and derived word pairs (both nouns and verbs) were minimized by unusually tight control of stimuli as permitted by Greek morphology. Results showed that morphological relations affected processing of morphologically complex Greek words (nouns and verbs) across prime durations (50-250 ms) as well as when items intervened between primes and targets. In two of the four experiments (Experiments 1 and 3), inflectionally related primes produced significantly greater effects than derivationally related primes, suggesting differences in processing inflectional versus derivational morphological relations, which may disappear when processing is less dependent on semantic effects (Experiment 4). Priming effects differed for verb vs. noun targets with long SOA priming (Experiment 3), consistent with processing differences between complex words of different grammatical class (nouns and verbs) when semantic effects are maximized. Taken together, results demonstrate that inflectional and derivational relations differentially affect processing complex words of different grammatical class (nouns and verbs). This finding indicates that distinctions of morphological relation (inflectional vs. derivational) are not of the same kind as distinctions of grammatical class (nouns vs. verbs). Asymmetric differences among inflected and derived verbs and nouns seem to depend on semantic effects and/or processing demands modulating priming effects very early in lexical processing of morphologically complex written words, consistent with models of lexical processing positing early access to morphological structure and early influence of semantics.

2.
Front Hum Neurosci; 13: 374, 2019.
Article in English | MEDLINE | ID: mdl-31695602

ABSTRACT

Sign languages are natural languages in the visual domain. Because they lack a written form, they provide a sharper tool than spoken languages for investigating lexicality effects, which may be confounded by orthographic processing. In a previous study, we showed that the neural networks supporting phoneme monitoring in deaf British Sign Language (BSL) users are modulated by phonology but not lexicality or iconicity. In the present study, we investigated whether this pattern generalizes to deaf Swedish Sign Language (SSL) users. BSL and SSL have a largely overlapping phoneme inventory but are mutually unintelligible because lexical overlap is small. This is important because it means that even when signs lexicalized in BSL are unintelligible to users of SSL, they are usually still phonologically acceptable. During fMRI scanning, deaf users of the two different sign languages monitored signs that were lexicalized in either one or both of those languages for phonologically contrastive elements. Neural activation patterns relating to different linguistic levels of processing were similar across sign languages; in particular, we found no effect of lexicality, supporting the notion that apparent lexicality effects on sublexical processing of speech may be driven by orthographic strategies. As expected, we found an effect of phonology but not iconicity. Further, there was a difference in neural activation between the two groups in a motion-processing region of the left occipital cortex, possibly driven by cultural differences, such as education. Importantly, this difference was not modulated by the linguistic characteristics of the material, underscoring the robustness of the neural activation patterns relating to different linguistic levels of processing.

3.
Mem Cognit; 44(4): 608-20, 2016 May.
Article in English | MEDLINE | ID: mdl-26800983

ABSTRACT

Working memory (WM) for spoken language improves when the to-be-remembered items correspond to preexisting representations in long-term memory. We investigated whether this effect generalizes to the visuospatial domain by administering a visual n-back WM task to deaf signers and hearing signers, as well as to hearing nonsigners. Four different kinds of stimuli were presented: British Sign Language (BSL; familiar to the signers), Swedish Sign Language (SSL; unfamiliar), nonsigns, and nonlinguistic manual actions. The hearing signers performed better with BSL than with SSL, demonstrating a facilitatory effect of preexisting semantic representation. The deaf signers also performed better with BSL than with SSL, but only when WM load was high. No effect of preexisting phonological representation was detected. The deaf signers performed better than the hearing nonsigners with all sign-based materials, but this effect did not generalize to nonlinguistic manual actions. We argue that deaf signers, who are highly reliant on visual information for communication, develop expertise in processing sign-based items, even when those items do not have preexisting semantic or phonological representations. Preexisting semantic representation, however, enhances the quality of the gesture-based representations temporarily maintained in WM by this group, thereby releasing WM resources to deal with increased load. Hearing signers, on the other hand, may make strategic use of their speech-based representations for mnemonic purposes. The overall pattern of results is in line with flexible-resource models of WM.


Subject(s)
Deafness/physiopathology; Memory, Short-Term/physiology; Semantics; Sign Language; Adult; Humans; Middle Aged; Space Perception/physiology; Visual Perception/physiology
4.
Neuroimage; 128: 328-341, 2016 Mar.
Article in English | MEDLINE | ID: mdl-26806289

ABSTRACT

In this study, predictions of the dual-route cascaded (DRC) model of word reading were tested using fMRI. Specifically, patterns of co-localization were investigated: (a) between pseudoword length effects and a pseudowords vs. fixation contrast, to reveal the sublexical grapho-phonemic conversion (GPC) system; and (b) between word frequency effects and a words vs. pseudowords contrast, to reveal the orthographic and phonological lexicon. Forty-four native speakers of Greek were scanned at 3T in an event-related lexical decision task with three event types: (a) 150 words in which frequency, length, bigram and syllable frequency, neighborhood, and orthographic consistency were decorrelated; (b) 150 matched pseudowords; and (c) fixation. Whole-brain analysis failed to reveal the predicted co-localizations. Further analysis with participant-specific regions of interest defined within masks from the group contrasts revealed length effects in left inferior parietal cortex and frequency effects in the left middle temporal gyrus. These findings could be interpreted as partially consistent with the existence of the GPC system and phonological lexicon of the model, respectively. However, there was no evidence in support of an orthographic lexicon, weakening overall support for the model. The results are discussed with respect to the prospect of using neuroimaging in cognitive model evaluation.


Subject(s)
Brain/physiology; Pattern Recognition, Visual/physiology; Reading; Adult; Female; Humans; Image Processing, Computer-Assisted; Magnetic Resonance Imaging; Male; Middle Aged; Recognition, Psychology/physiology; Young Adult
5.
J Cogn Neurosci; 28(1): 20-40, 2016 Jan.
Article in English | MEDLINE | ID: mdl-26351993

ABSTRACT

The study of signed languages allows the dissociation of sensorimotor and cognitive neural components of the language signal. Here we investigated the neurocognitive processes underlying the monitoring of two phonological parameters of sign languages: handshape and location. Our goal was to determine if brain regions processing sensorimotor characteristics of different phonological parameters of sign languages were also involved in phonological processing, with their activity being modulated by the linguistic content of manual actions. We conducted an fMRI experiment using manual actions varying in phonological structure and semantics: (1) signs of a familiar sign language (British Sign Language), (2) signs of an unfamiliar sign language (Swedish Sign Language), and (3) invented nonsigns that violate the phonological rules of British Sign Language and Swedish Sign Language or consist of nonoccurring combinations of phonological parameters. Three groups of participants were tested: deaf native signers, deaf nonsigners, and hearing nonsigners. Results show that the linguistic processing of different phonological parameters of sign language is independent of the sensorimotor characteristics of the language signal. Handshape and location were processed by different perceptual and task-related brain networks but recruited the same language areas. The semantic content of the stimuli did not influence this process, but phonological structure did, with nonsigns being associated with longer RTs and stronger activations in an action observation network in all participants and in the supramarginal gyrus exclusively in deaf signers. These results suggest higher processing demands for stimuli that contravene the phonological rules of a signed language, independently of previous knowledge of signed languages. We suggest that the phonological characteristics of a language may arise as a consequence of more efficient neural processing for its perception and production.


Subject(s)
Brain Mapping; Cerebral Cortex/physiopathology; Perception/physiology; Phonetics; Adult; Analysis of Variance; Cerebral Cortex/blood supply; Cues; Deafness/pathology; Female; Humans; Image Processing, Computer-Assisted; Magnetic Resonance Imaging; Male; Middle Aged; Oxygen/blood; Photic Stimulation; Psychoacoustics; Reaction Time/physiology; Semantics
6.
Neuroimage; 124(Pt A): 96-106, 2016 Jan 01.
Article in English | MEDLINE | ID: mdl-26348556

ABSTRACT

Sensory cortices undergo crossmodal reorganisation as a consequence of sensory deprivation. Congenital deafness in humans represents a particular case with respect to other types of sensory deprivation, because cortical reorganisation is not only a consequence of auditory deprivation, but also of language-driven mechanisms. Visual crossmodal plasticity has been found in secondary auditory cortices of deaf individuals, but it is still unclear if reorganisation also takes place in primary auditory areas, and how this relates to language modality and auditory deprivation. Here, we dissociated the effects of language modality and auditory deprivation on crossmodal plasticity in Heschl's gyrus as a whole, and in cytoarchitectonic region Te1.0 (likely to contain the core auditory cortex). Using fMRI, we measured the BOLD response to viewing sign language in congenitally or early deaf individuals with and without sign language knowledge, and in hearing controls. Results show that differences between hearing and deaf individuals are due to a reduction in activation caused by visual stimulation in the hearing group, which is more significant in Te1.0 than in Heschl's gyrus as a whole. Furthermore, differences between deaf and hearing groups are due to auditory deprivation, and there is no evidence that the modality of language used by deaf individuals contributes to crossmodal plasticity in Heschl's gyrus.


Subject(s)
Auditory Cortex/physiopathology; Deafness/physiopathology; Neuronal Plasticity; Sign Language; Adult; Brain Mapping; Echo-Planar Imaging; Female; Humans; Linguistics; Male; Middle Aged
7.
Q J Exp Psychol (Hove); 68(4): 641-63, 2015.
Article in English | MEDLINE | ID: mdl-25628070

ABSTRACT

This study asks how users of British Sign Language (BSL) recognize individual signs in connected sign sequences. We examined whether this is achieved through modality-specific or modality-general segmentation procedures. A modality-specific feature of signed languages is that, during continuous signing, there are salient transitions between sign locations. We used the sign-spotting task to ask if and how BSL signers use these transitions in segmentation. A total of 96 real BSL signs were preceded by nonsense signs which were produced in either the target location or another location (with a small or large transition). Half of the transitions were within the same major body area (e.g., head) and half were across body areas (e.g., chest to hand). Deaf adult BSL users (a group of natives and early learners, and a group of late learners) spotted target signs best when there was a minimal transition and worst when there was a large transition. When location changes were present, both groups performed better when transitions were to a different body area than when they were within the same area. These findings suggest that transitions do not provide explicit sign-boundary cues in a modality-specific fashion. Instead, we argue that smaller transitions help recognition in a modality-general way by limiting lexical search to signs within location neighbourhoods, and that transitions across body areas also aid segmentation in a modality-general way, by providing a phonotactic cue to a sign boundary. We propose that sign segmentation is based on modality-general procedures which are core language-processing mechanisms.


Subject(s)
Persons With Hearing Impairments/psychology; Recognition, Psychology/physiology; Semantics; Sign Language; Adolescent; Adult; Analysis of Variance; Cues; Female; Humans; Male; Middle Aged; Reaction Time; Young Adult
8.
Nat Commun; 4: 1473, 2013.
Article in English | MEDLINE | ID: mdl-23403574

ABSTRACT

Disentangling the effects of sensory and cognitive factors on neural reorganization is fundamental for establishing the relationship between plasticity and functional specialization. Auditory deprivation in humans provides a unique insight into this problem, because the origin of the anatomical and functional changes observed in deaf individuals is not only sensory, but also cognitive, owing to the implementation of visual communication strategies such as sign language and speechreading. Here, we describe a functional magnetic resonance imaging study of individuals with different auditory deprivation and sign language experience. We find that sensory and cognitive experience cause plasticity in anatomically and functionally distinguishable substrates. This suggests that after plastic reorganization, cortical regions adapt to process a different type of input signal, but preserve the nature of the computation they perform, both at a sensory and cognitive level.


Subject(s)
Cognition/physiology; Neuronal Plasticity/physiology; Sensation/physiology; Temporal Lobe/physiopathology; Adult; Auditory Threshold/physiology; Deafness/physiopathology; Female; Humans; Language; Male; Middle Aged; Sign Language; Sound; Sweden
9.
Front Psychol; 4: 942, 2013.
Article in English | MEDLINE | ID: mdl-24379797

ABSTRACT

Similar working memory (WM) for lexical items has been demonstrated for signers and non-signers while short-term memory (STM) is regularly poorer in deaf than hearing individuals. In the present study, we investigated digit-based WM and STM in Swedish and British deaf signers and hearing non-signers. To maintain good experimental control we used printed stimuli throughout and held response mode constant across groups. We showed that deaf signers have similar digit-based WM performance, despite shorter digit spans, compared to well-matched hearing non-signers. We found no difference between signers and non-signers on STM span for letters chosen to minimize phonological similarity or in the effects of recall direction. This set of findings indicates that similar WM for signers and non-signers can be generalized from lexical items to digits and suggests that poorer STM in deaf signers compared to hearing non-signers may be due to differences in phonological similarity across the language modalities of sign and speech.

10.
Cognition; 124(1): 50-65, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22578601

ABSTRACT

Age of acquisition (AoA) effects have been used to support the notion of a critical period for first language acquisition. In this study, we examine AoA effects in deaf British Sign Language (BSL) users via a grammaticality judgment task. When English reading performance and nonverbal IQ are factored out, results show that accuracy of grammaticality judgment decreases as AoA increases, until around age 8, thus showing the unique effect of AoA on grammatical judgment in early learners. No such effects were found in those who acquired BSL after age 8. These late learners appear to have first language proficiency in English instead, which may have been used to scaffold learning of BSL as a second language later in life.


Subject(s)
Critical Period, Psychological; Education of Hearing Disabled; Language Development; Reading; Sign Language; Adult; Child; Female; Humans; Language; Linguistics; Male; Middle Aged; Regression Analysis
11.
Q J Exp Psychol (Hove); 64(1): 96-121, 2011 Jan.
Article in English | MEDLINE | ID: mdl-20509097

ABSTRACT

Two experiments explored repetition priming effects for spoken words and pseudowords in order to investigate abstractionist and episodic accounts of spoken word recognition and repetition priming. In Experiment 1, lexical decisions were made on spoken words and pseudowords with half of the items presented twice (∼12 intervening items). Half of all repetitions were spoken in a "different voice" from the first presentations. Experiment 2 used the same procedure but with stimuli embedded in noise to slow responses. Results showed greater priming for words than for pseudowords and no effect of voice change in both normal and effortful processing conditions. Additional analyses showed that for slower participants, priming is more equivalent for words and pseudowords, suggesting episodic stimulus-response associations that suppress familiarity-based mechanisms that ordinarily enhance word priming. By relating behavioural priming to the time-course of pseudoword identification we showed that under normal listening conditions (Experiment 1) priming reflects facilitation of both perceptual and decision components, whereas in effortful listening conditions (Experiment 2) priming effects primarily reflect enhanced decision/response generation processes. Both stimulus-response associations and enhanced processing of sensory input seem to be voice independent, providing novel evidence concerning the degree of perceptual abstraction in the recognition of spoken words and pseudowords.


Subject(s)
Discrimination, Psychological/physiology; Recognition, Psychology/physiology; Speech Perception/physiology; Verbal Learning/physiology; Vocabulary; Acoustic Stimulation/methods; Adolescent; Adult; Analysis of Variance; Attention; Female; Humans; Male; Photic Stimulation/methods; Reaction Time/physiology; Statistics as Topic; Young Adult
12.
Mem Cognit; 37(3): 302-15, 2009 Apr.
Article in English | MEDLINE | ID: mdl-19246345

ABSTRACT

Do all components of a sign contribute equally to its recognition? In the present study, misperceptions in the sign-spotting task (based on the word-spotting task; Cutler & Norris, 1988) were analyzed to address this question. Three groups of deaf signers of British Sign Language (BSL) with different ages of acquisition (AoA) saw BSL signs combined with nonsense signs, along with combinations of two nonsense signs. They were asked to spot real signs and report what they had spotted. We present an analysis of false alarms to the nonsense-sign combinations, that is, misperceptions of nonsense signs as real signs (cf. van Ooijen, 1996). Participants modified the movement and handshape parameters more than the location parameter. Within this pattern, however, there were differences as a function of AoA. These results show that the theoretical distinctions between form-based parameters in sign-language models have consequences for online processing. Vowels and consonants have different roles in speech recognition; similarly, it appears that movement, handshape, and location parameters contribute differentially to sign recognition.


Subject(s)
Language; Phonetics; Recognition, Psychology; Sign Language; Adolescent; Adult; Concept Formation; Deafness/psychology; Deafness/rehabilitation; Discrimination Learning; Female; Humans; Male; Middle Aged; Young Adult
13.
J Cogn Neurosci; 18(8): 1237-52, 2006 Aug.
Article in English | MEDLINE | ID: mdl-16859411

ABSTRACT

An important method for studying how the brain processes familiar stimuli is to present the same item on more than one occasion and measure how responses change with repetition. Here we use repetition priming in a sparse functional magnetic resonance imaging (fMRI) study to probe the neuroanatomical basis of spoken word recognition and the representations of spoken words that mediate repetition priming effects. Participants made lexical decisions to words and pseudowords spoken by a male or female voice that were presented twice, with half of the repetitions in a different voice. Behavioral and neural priming was observed for both words and pseudowords and was not affected by voice changes. The fMRI data revealed an elevated response to words compared to pseudowords in both posterior and anterior temporal regions, suggesting that both contribute to word recognition. Both reduced and elevated activation for second presentations (repetition suppression and enhancement) were observed in frontal and posterior regions. Correlations between behavioral priming and neural repetition suppression were observed in frontal regions, suggesting that repetition priming effects for spoken words reflect changes within systems involved in generating behavioral responses. Based on the current results, these processes are sufficiently abstract to display priming despite changes in the physical form of the stimulus and operate equivalently for words and pseudowords.


Subject(s)
Brain Mapping; Brain/physiology; Recognition, Psychology/physiology; Repression, Psychology; Speech/physiology; Adolescent; Adult; Analysis of Variance; Brain/blood supply; Discrimination, Psychological; Female; Humans; Image Processing, Computer-Assisted/methods; Magnetic Resonance Imaging/methods; Male; Oxygen/blood; Predictive Value of Tests; Reaction Time/physiology; Sex Factors
14.
Mem Cognit; 33(2): 355-69, 2005 Mar.
Article in English | MEDLINE | ID: mdl-16028589

ABSTRACT

In two experiments, Greek-English bilinguals alternated between performing a lexical decision task in Greek and in English. The cost to performance on switch trials interacted with response repetition, implying that a source of this "switch cost" is at the level of response mapping or initiation. Orthographic specificity also affected switch cost. Greek and English have partially overlapping alphabets, which enabled us to manipulate language specificity at the letter level, rather than only at the level of letter clusters. Language-nonspecific stimuli used only symbols common to both Greek and English, whereas language-specific stimuli contained letters unique to just one language. The switch cost was markedly reduced by such language-specific orthography, and this effect did not interact with the effect of response repetition, implying a separate, stimulus-sensitive source of switch costs. However, we argue that this second source is not within the word-recognition system, but at the level of task schemas, because the reduction of switch cost with language-specific stimuli was abolished when these stimuli were intermingled with language-nonspecific stimuli.


Subject(s)
Language; Periodicity; Psychomotor Performance; Verbal Behavior; Adult; Female; Humans; Male; Multilingualism; Reaction Time