Results 1 - 20 of 26
2.
Brain Lang ; 66(2): 233-48, 1999 Feb 01.
Article in English | MEDLINE | ID: mdl-10190988

ABSTRACT

Previous findings have demonstrated that hemispheric organization in deaf users of American Sign Language (ASL) parallels that of the hearing population, with the left hemisphere showing dominance for grammatical linguistic functions and the right hemisphere showing specialization for non-linguistic spatial functions. The present study addresses two further questions: first, do extra-grammatical discourse functions in deaf signers show the same right-hemisphere dominance observed for discourse functions in hearing subjects; and second, do discourse functions in ASL that employ spatial relations depend upon more general intact spatial cognitive abilities? We report findings from two right-hemisphere-damaged deaf signers, both of whom show disruption of discourse functions in the absence of any disruption of grammatical functions. The exact nature of the disruption differs for the two subjects, however. Subject AR shows difficulty in maintaining topical coherence, while SJ shows difficulty in employing spatial discourse devices. Further, the two subjects are equally impaired on non-linguistic spatial tasks, indicating that spared spatial discourse functions can occur even when more general spatial cognition is disrupted. We conclude that, as in the hearing population, discourse functions involve the right hemisphere; that distinct discourse functions can be dissociated from one another in ASL; and that brain organization for linguistic spatial devices is driven by its functional role in language processing, rather than by its surface, spatial characteristics.


Subject(s)
Brain Diseases/complications , Communication Disorders/etiology , Deafness/complications , Functional Laterality/physiology , Sign Language , Aged , Communication Disorders/diagnosis , Humans , Male
3.
Cognition ; 68(3): 221-46, 1998 Sep.
Article in English | MEDLINE | ID: mdl-9852666

ABSTRACT

American Sign Language (ASL) uses space itself to encode spatial information. Spatial scenes are most often described from the perspective of the person signing (the 'narrator'), such that the viewer must perform what amounts to a 180-degree mental rotation to correctly comprehend the description. But scenes can also be described, non-canonically, from the viewer's perspective, in which case no rotation is required. Is mental rotation during sign language processing difficult for ASL signers? Are there differences between linguistic and non-linguistic mental rotation? Experiment 1 required subjects to decide whether a signed description matched a room presented on videotape. Deaf ASL signers were more accurate when viewing scenes described from the narrator's perspective (even though rotation is required) than from the viewer's perspective (no rotation required). In Experiment 2, deaf signers and hearing non-signers viewed videotapes of objects appearing briefly and sequentially on a board marked with an entrance. This board either matched an identical board in front of the subject or was rotated 180 degrees. Subjects were asked to place objects on their board in the orientation and location shown on the video, making the appropriate rotation when required. All subjects were significantly less accurate when rotation was required, but ASL signers performed significantly better than hearing non-signers under rotation. ASL signers were also more accurate in remembering object orientation. Signers then viewed a video in which the same scenes were signed from the two perspectives (i.e. rotation required or no rotation required). In contrast to their performance with real objects, signers did not show the typical mental rotation effect. Males outperformed females on the rotation task with objects, but this superiority disappeared in the linguistic condition. We discuss the nature of the ASL mental rotation transformation, and we conclude that habitual use of ASL can enhance non-linguistic cognitive processes, thus providing evidence for (a form of) the linguistic relativity hypothesis.
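To make the spatial transformation concrete, the short Python sketch below illustrates the 180-degree rotation a viewer must apply when a scene is signed from the narrator's perspective: each object's position is reflected through the center of the shared signing space. This is an illustrative sketch only, not code or materials from the study; the coordinate convention and the rotate_180 helper are assumptions introduced here.

```python
# Illustrative sketch (not from the study): the 180-degree mental rotation a
# viewer applies when a signed scene is described from the narrator's
# perspective. The coordinate convention and helper name are assumptions.

def rotate_180(x: float, y: float, cx: float = 0.0, cy: float = 0.0) -> tuple[float, float]:
    """Rotate the point (x, y) by 180 degrees about the center (cx, cy)."""
    return (2 * cx - x, 2 * cy - y)

# An object the narrator places to their front-left ends up to the viewer's
# front-right once the rotation is applied.
narrator_scene = {"cup": (-1.0, 2.0), "lamp": (1.5, 0.5)}
viewer_scene = {name: rotate_180(x, y) for name, (x, y) in narrator_scene.items()}
print(viewer_scene)  # {'cup': (1.0, -2.0), 'lamp': (-1.5, -0.5)}
```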


Subject(s)
Mental Processes/physiology , Rotation , Sign Language , Adult , Deafness , Female , Humans , Linguistics , Male , United States
4.
Trends Cogn Sci ; 2(4): 129-36, 1998 Apr 01.
Article in English | MEDLINE | ID: mdl-21227109

ABSTRACT

To what extent is the neural organization of language dependent on factors specific to the modalities in which language is perceived and through which it is produced? That is, is the left-hemisphere dominance for language a function of a linguistic specialization or a function of some domain-general specialization(s), such as temporal processing or motor planning? Investigations of the neurobiology of signed language can help answer these questions. As with spoken languages, signed languages of the deaf display complex grammatical structure but are perceived and produced via radically different modalities. Thus, by mapping out the neurological similarities and differences between signed and spoken language, it is possible to identify modality-specific contributions to brain organization for language. Research to date has shown a significant degree of similarity in the neurobiology of signed and spoken languages, suggesting that the neural organization of language is largely modality-independent.

6.
J Deaf Stud Deaf Educ ; 2(3): 150-60, 1997.
Article in English | MEDLINE | ID: mdl-15579844

ABSTRACT

Deaf children who are native users of American Sign Language (ASL) and hearing children who are native English speakers performed three working memory tasks. Results indicate that language modality shapes the architecture of working memory. Digit span with forward and backward report, performed by each group in their native language, suggests that the language rehearsal mechanisms for spoken language and for sign language differ in their processing constraints. Unlike hearing children, deaf children who are native signers of ASL were as good at backward recall of digits as at forward recall, suggesting that serial order information for ASL is stored in a form that does not have a preferred directionality. Data from a group of deaf children who were not native signers of ASL rule out explanations in terms of a floor effect or a nonlinguistic visual strategy. Further, deaf children who were native signers outperformed hearing children on a nonlinguistic spatial memory task, suggesting that language expertise in a particular modality exerts an influence on nonlinguistic working memory within that modality. Thus, language modality has consequences for the structure of working memory, both within and outside the linguistic domain.
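As a concrete illustration of the forward versus backward report conditions compared above, the Python sketch below scores a recalled digit sequence against the presented list in either direction. It is a hypothetical rendering of the task logic, not the scoring procedure used in the study.

```python
# Hypothetical sketch of forward vs. backward digit-span scoring; not the
# study's actual procedure.

def span_correct(presented: list[int], recalled: list[int], backward: bool = False) -> bool:
    """Return True if recall matches the presented digits in the required
    report order (reversed when backward=True)."""
    target = list(reversed(presented)) if backward else list(presented)
    return recalled == target

digits = [7, 2, 9, 4]
print(span_correct(digits, [7, 2, 9, 4]))                  # forward report: True
print(span_correct(digits, [4, 9, 2, 7], backward=True))   # backward report: True
```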

8.
Nature ; 381(6584): 699-702, 1996 Jun 20.
Article in English | MEDLINE | ID: mdl-8649515

ABSTRACT

The left cerebral hemisphere is dominant for language, and many aspects of language use are more impaired by damage to the left than the right hemisphere. The basis for this asymmetry, however, is a matter of debate; the left hemisphere may be specialized for processing linguistic information or for some more general function on which language depends, such as the processing of rapidly changing temporal information or execution of complex motor patterns. To investigate these possibilities, we examined the linguistic abilities of 23 sign-language users with unilateral brain lesions. Despite the fact that sign language relies on visuospatial rather than rapid temporal information, the same left-hemispheric dominance emerged. Correlation analyses of the production of sign language versus non-linguistic hand gestures suggest that these processes are largely independent. Our findings support the view that the left-hemisphere dominance for language is not reducible solely to more general sensory or motor processes.


Subject(s)
Brain/physiology , Functional Laterality/physiology , Language , Sign Language , Adult , Aged , Aged, 80 and over , Brain/physiopathology , Cerebral Infarction/physiopathology , Female , Humans , Male , Middle Aged
9.
Neuropsychologia ; 33(12): 1597-606, 1995 Dec.
Article in English | MEDLINE | ID: mdl-8745117

ABSTRACT

We report on a right-handed, deaf, lifelong signer who suffered a left posterior cerebral artery (PCA) stroke. The patient presented with right homonymous hemianopia, alexia and a severe sign comprehension deficit. Her production of sign language was, however, virtually normal. We suggest that her syndrome can be characterized as a case of 'sign blindness', a disconnection of the intact right hemisphere visual areas from intact left hemisphere language areas. This case provides strong evidence that the neural systems supporting sign language processing are predominantly in the left hemisphere, but also suggests that there are some differences in the neural organization of signed versus spoken language within the left hemisphere.


Subject(s)
Functional Laterality/physiology , Hemianopsia/physiopathology , Occipital Lobe/physiopathology , Sign Language , Space Perception , Female , Hemianopsia/diagnosis , Humans , Middle Aged , Temporal Lobe/physiopathology , Tomography, X-Ray Computed
10.
J Cogn Neurosci ; 7(2): 196-208, 1995.
Article in English | MEDLINE | ID: mdl-23961824

ABSTRACT

Many species can respond to the behavior of their conspecifics. Human children, and perhaps some nonhuman primates, also have the capacity to respond to the mental states of their conspecifics, i.e., they have a "theory of mind." On the basis of previous research on the theory-of-mind impairment in people with autism, together with animal models of intentionality, Brothers and Ring (1992) postulated a broad cognitive module whose function is to build representations of other individuals. We evaluate the details of this hypothesis through a series of experiments on language, face processing, and theory of mind carried out with subjects with Williams syndrome, a rare genetic neurodevelopmental disorder resulting in an uneven linguistico-cognitive profile. The results are discussed in terms of how the comparison of different phenotypes (e.g., Williams syndrome, Down syndrome, autism, and hydrocephaly with associated myelomeningocele) can contribute both to understanding the neuropsychology of social cognition and to current thinking about the purported modularity of the brain.

11.
Neuropsychologia ; 30(4): 329-40, 1992 Apr.
Article in English | MEDLINE | ID: mdl-1376447

ABSTRACT

Three severely aphasic hearing patients with no prior knowledge of sign language were able to acquire competency in aspects of American Sign Language (ASL) lexicon and finger spelling, in contrast to a near complete inability to speak the English counterparts of these visuo-gestural signs. Two patients with damage in left postero-lateral temporal and inferior parietal cortices mastered production and comprehension of single signs and short meaningful sign sequences, but the one patient with damage to virtually all left temporal cortices was less accurate in single sign processing and was unable to produce sequences of signs at all. These findings suggest that conceptual knowledge is represented independently of the auditory-vocal records for the corresponding lexical entries, and that left anterior temporal cortices outside of traditional "language areas" are part of the neural network which supports the linkage between conceptual knowledge and linguistic signs, especially as they are used in the sequenced activations required for production or comprehension of meaningful sentences.


Subject(s)
Aphasia/rehabilitation , Brain Damage, Chronic/rehabilitation , Dominance, Cerebral , Mental Recall , Sign Language , Adult , Aphasia/psychology , Brain Damage, Chronic/psychology , Brain Mapping , Cerebral Infarction/psychology , Cerebral Infarction/rehabilitation , Encephalitis/psychology , Encephalitis/rehabilitation , Female , Herpes Simplex/psychology , Herpes Simplex/rehabilitation , Humans , Magnetic Resonance Imaging , Male , Middle Aged , Neuropsychological Tests
12.
Arch Biochem Biophys ; 288(1): 131-40, 1991 Jul.
Article in English | MEDLINE | ID: mdl-1654819

ABSTRACT

Transport of large proteins into the nucleus requires both a nuclear localization signal (NLS) and exposure of that signal to components of the transport machinery. In this report, polyclonal and monoclonal antibodies were generated against the NLS of SV40 large T antigen. Several of these antibodies immunoprecipitated large T antigen produced by in vitro transcription-translation and recognized T antigen expressed in cultured cells. Binding of the antibodies to T antigen was quantified using an indirect radioimmunoassay and found to be specifically inhibited by peptides corresponding to the T antigen NLS. The ability of NLS-specific antibodies to recognize large T antigen suggests that the NLS is exposed on the surface of T antigen. When one of the NLS-specific monoclonal antibodies was introduced into the cytoplasm of cells expressing T antigen, the antibody remained cytoplasmic. These results suggested either that cytoplasmic components compete for binding to the NLS or that the antibody dissociates from T antigen during transport into the nucleus. When an antibody directed against an epitope distinct from the NLS was microinjected into the cytoplasm of cells expressing large T antigen, both the antibody and antigen were transported into the nucleus. The observed stability of the antigen-antibody complex strongly suggests that protein unfolding is not required for nuclear protein transport.


Subject(s)
Antibodies, Viral , Antigens, Viral, Tumor/immunology , Simian virus 40/immunology , Amino Acid Sequence , Animals , Antibodies, Monoclonal/administration & dosage , Antibodies, Monoclonal/metabolism , Antibodies, Viral/administration & dosage , Antibodies, Viral/metabolism , Antigens, Viral, Tumor/genetics , Antigens, Viral, Tumor/metabolism , Biological Transport, Active , Cell Line , Cell Nucleus/immunology , Cytoplasm/immunology , Mice , Microinjections , Molecular Sequence Data , Oligopeptides/chemistry , Oligopeptides/immunology , Protein Biosynthesis , Transcription, Genetic
14.
Trends Neurosci ; 12(10): 380-8, 1989 Oct.
Article in English | MEDLINE | ID: mdl-2479135

ABSTRACT

Studies of the signed languages of deaf people have shown that fully expressive languages can arise, outside of the mainstream of spoken languages, that exhibit the complexities of linguistic organization found in all spoken languages. Thus, the human capacity for language is not linked to some privileged cognitive-auditory connection. However, the formal properties of languages (spoken or signed) appear to be highly conditioned by the modalities involved in their perception and production. Multi-layering of linguistic elements and the use of space in the service of syntax appear to be modality-determined aspects of signed languages. Analyses of patterns of breakdown of signed languages provide new perspectives on the nature of cerebral organization for language. The studies reviewed in this article show that the left cerebral hemisphere in man is specialized for signed as well as spoken languages, and thus may have an innate predisposition for language, independent of language modality.


Subject(s)
Brain/physiology , Functional Laterality/physiology , Language Development , Manual Communication , Sign Language , Spatial Behavior , Adult , Aged , Aged, 80 and over , Brain Diseases/physiopathology , Female , Humans , Male , Middle Aged
15.
Article in English | MEDLINE | ID: mdl-2451852

ABSTRACT

Analysis of the patterns of breakdown of a visuospatial language in deaf signers allows new perspectives on the nature and determinants of cerebral specialization for language. First, these data show that hearing and speech are not necessary for the development of hemispheric specialization--sound is not crucial. Second, the data show that in these deaf signers, it is the left hemisphere that is dominant for sign language. The patients with damage to the left hemisphere showed marked sign language deficits but relatively intact capacity for processing nonlanguage visuospatial relations. The patients with damage to the right hemisphere showed much the reverse pattern. Thus, not only is there left hemisphere specialization for language functioning, there is a complementary right hemisphere specialization for visuospatial functioning. The fact that much of the grammatical information is conveyed via spatial manipulation appears not to alter this complementary specialization. Furthermore, the finding that components of sign language (e.g., lexicon and grammar) can be selectively impaired suggests that the functional organization of the brain for sign language may turn out to be modular. Finally, patients with left and right hemisphere damage showed dissociations between two uses of space in the language--one to represent spatial relations and the other to represent syntactic relations. Right hemisphere damage disrupts the former but spares the latter; left hemisphere damage disrupts the use of space for syntactic relations but spares its use for spatial relations. Taken together with studies of the processing of sign language "on line" by neurologically intact deaf signers, these data suggest that the left cerebral hemisphere in humans may have an innate predisposition for language, independent of language modality. Studies of the effects of brain damage on signing make it clear that accounts of hemispheric specialization are oversimplified if stated only in terms of a dichotomy between language and visuospatial functioning. Such studies may also permit us to come closer to the real principles underlying the specializations of the two cerebral hemispheres, since in sign language there is interplay between visuospatial and linguistic relations within the same system.


Subject(s)
Brain Damage, Chronic/physiopathology , Brain/physiopathology , Dominance, Cerebral/physiology , Manual Communication , Sign Language , Aphasia/physiopathology , Apraxias/physiopathology , Deafness/physiopathology , Functional Laterality/physiology , Humans , Orientation/physiology , Psychomotor Performance/physiology , Semantics
16.
J Bacteriol ; 169(2): 856-63, 1987 Feb.
Article in English | MEDLINE | ID: mdl-2433267

ABSTRACT

The survival of Salmonella montevideo during serum treatment depends on the presence of an O antigen (O-Ag) associated with the lipopolysaccharide molecule. In this organism, the O antigen is a polysaccharide composed of 0 to more than 55 subunits, each containing 4 mannose residues together with glucose and N-acetylglucosamine. We used a mutant strain of S. montevideo that requires exogenous mannose for the synthesis of O-Ag. Lipopolysaccharide (LPS) was prepared from these cells grown under three different conditions where the availability of exogenous mannose was regulated such that the average number of O-Ag units per LPS molecule, the percentage of LPS molecules bearing long O-Ag side chains, and the percentage of lipid A cores bearing O-Ag were all varied. These changes in LPS profiles were monitored on sodium dodecyl sulfate-polyacrylamide gels, and cells with different LPS profiles were tested for their ability to survive treatment with pooled normal human serum. Survival in serum was associated with LPS that contained an average of 4 to 5 O-Ag units per LPS molecule, and 20 to 23% of the LPS molecules had more than 14 O-Ag units per LPS molecule. Serum survival was less clearly associated with the percentage of lipid A cores covered with O-Ag. We propose, based on these data and on previous work, that the O-Ag polysaccharide provides the cell protection from serum killing by sterically hindering access of the C5b-9 complex to the outer membrane and that a critical density of long O-Ag polysaccharide is necessary to provide protection.


Subject(s)
Antigens, Bacterial/analysis , Blood/microbiology , Lipopolysaccharides/analysis , Salmonella/immunology , Culture Media , Humans , Kinetics , Mannose/metabolism , O Antigens , Phosphorus Radioisotopes , Salmonella/growth & development
17.
Hum Neurobiol ; 2(3): 155-70, 1983.
Article in English | MEDLINE | ID: mdl-6668233

ABSTRACT

Since sign language conveys many grammatical relations by manipulating spatial relations, its study provides a unique opportunity to investigate cerebral specialization for language. Three deaf signers with damage to the left hemisphere were administered an array of formal sign language tests and a linguistic analysis of their spontaneous signing was performed. All three signers showed aphasia for sign language. Strikingly, in these patients, differential damage within the left hemisphere appeared to lead to selective impairment of the structural layers of sign language (e.g. lexicon versus grammar). These data provide the first demonstration of grammatical breakdown in sign language. Importantly, the language impairments of these patients stood in marked contrast to their relatively intact capacities to process nonlanguage visual-spatial relationships. These results suggest that the two cerebral hemispheres of deaf signers can develop separate specializations for linguistic and for visual-spatial processing, even for a visual-spatial language. They further suggest that the left hemisphere has an innate predisposition for language.


Subject(s)
Aphasia/psychology , Dominance, Cerebral/physiology , Manual Communication , Sign Language , Adult , Aged , Brain Damage, Chronic/psychology , Female , Humans , Linguistics , Male , Neuropsychological Tests , Spatial Behavior , Visual Perception/physiology
18.
Int Arch Allergy Appl Immunol ; 68(1): 60-9, 1982.
Article in English | MEDLINE | ID: mdl-6978853

ABSTRACT

Serum concentrations of anti-DNA and anti-deoxyribonucleoprotein (NP) antibodies were measured in parallel by standardized ELISA methods with a polyvalent anti-immunoglobulin conjugate in patients with systemic lupus erythematosus (SLE), Sjögren's syndrome (SS) and rheumatoid arthritis (RA). High levels of these antibodies predominated in systemic lupus erythematosus. While an appreciable incidence of antibodies also occurred in SS and RA, they were mostly at lower levels. By using heavy chain-specific anti-immunoglobulin conjugates, IgG antibodies to both DNA and NP were found in SLE more frequently and at higher levels than were IgM antibodies. In contrast, IgM antibodies to DNA and NP predominated in SS and RA. The immunoglobulin class of the anti-DNA and anti-NP responses in a given SLE patient were not infrequently different. For example, a patient might show a very high IgG but low IgM anti-DNA value, with the reverse being true for anti-NP. IgG anti-DNA antibodies were significantly associated with depressions of C3. During changes in SLE serology, normalization of DNA binding by Farr radioimmunoassay and/or complement was most frequently associated with normalization of the IgG anti-DNA antibody concentrations. In patients simultaneously possessing elevated levels of anti-DNA, anti-NP and rheumatoid factor (RF), absorption with aggregated human IgG usually decreased only the RF activity. In some, however, such absorption decreased all three antibody values simultaneously. The latter findings support observations that some RF possess antinuclear properties.


Subject(s)
Autoantibodies/analysis , DNA/immunology , Deoxyribonucleoproteins/immunology , Nucleoproteins/immunology , Rheumatoid Factor/analysis , Antibodies, Anti-Idiotypic , Antibody Specificity , Arthritis, Rheumatoid/immunology , Autoantibodies/classification , Binding, Competitive , Enzyme-Linked Immunosorbent Assay , Humans , Immunoglobulin G/analysis , Immunoglobulin M/analysis , Lupus Erythematosus, Systemic/immunology , Sjogren's Syndrome/immunology
20.
J Exp Psychol Hum Learn ; 7(6): 464-74, 1981 Nov.
Article in English | MEDLINE | ID: mdl-7328396

ABSTRACT

A series of unordered recall tasks was administered to groups of congenitally deaf subjects for whom American Sign Language (ASL) is the principal means of communication. A suffix effect was observed when an ASL sign was suffixed to a list of ASL signs (Experiment 1), and when a line drawing of an ASL sign was suffixed to a list of line drawings of ASL signs (Experiment 3). The suffix effect was of diminished magnitude when a printed English word was suffixed to a list of printed words (Experiment 2). The findings of Experiments 1 and 3 argue conclusively against the suffix effect resulting solely from sensory store differences. Additionally, the results of Experiment 3 argue conclusively against an explanation of the effect as arising solely from differences in the processing of "static" versus "changing-state" input.


Subject(s)
Deafness/congenital , Manual Communication , Memory , Mental Recall , Sign Language , Deafness/psychology , Female , Humans , Male , Reading , Serial Learning