1.
Front Neurosci ; 17: 1218510, 2023.
Article in English | MEDLINE | ID: mdl-37901437

ABSTRACT

Introduction: Sensory inference and top-down predictive processing, reflected in human neural activity, play a critical role in higher-order cognitive processes, such as language comprehension. However, the neurobiological bases of predictive processing in higher-order cognitive processes are not well understood. Methods: This study used electroencephalography (EEG) to track participants' cortical dynamics in response to Austrian Sign Language and reversed sign language videos, measuring neural coherence to optical flow in the visual signal. We then used machine learning to assess the entropy-based relevance of specific frequencies and regions of interest to brain state classification accuracy. Results: EEG features highly relevant for classification were distributed across language processing-related regions in Deaf signers (frontal cortex and left hemisphere), while in non-signers such features were concentrated in visual and spatial processing regions. Discussion: The results highlight the functional significance of predictive processing time windows for sign language comprehension and biological motion processing, and the role of long-term experience (learning) in minimizing prediction error.
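The coherence measure described here can be illustrated with a minimal sketch: magnitude-squared coherence between one EEG channel and an optical-flow magnitude series, assuming the two are already aligned at a common sampling rate. The arrays, sampling rate, and frequency band below are hypothetical stand-ins, not the study's actual pipeline.

```python
import numpy as np
from scipy.signal import coherence

fs = 250.0                                         # assumed common sampling rate (Hz)
rng = np.random.default_rng(0)
flow = rng.standard_normal(int(60 * fs))           # stand-in optical-flow magnitude series
eeg = 0.3 * flow + rng.standard_normal(flow.size)  # EEG channel weakly coupled to the flow

# Magnitude-squared coherence in ~0.5 Hz bins; values near 1 indicate strong
# linear coupling between the visual signal and the EEG channel at that frequency.
freqs, coh = coherence(eeg, flow, fs=fs, nperseg=int(2 * fs))

# Summarize coherence in a low-frequency band of interest (e.g., 1-5 Hz).
band = (freqs >= 1) & (freqs <= 5)
print(f"mean 1-5 Hz coherence: {coh[band].mean():.3f}")
```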

2.
Rural Spec Educ Q ; 42(2): 105-118, 2023 Jun.
Article in English | MEDLINE | ID: mdl-38602929

ABSTRACT

This position paper explores the needs of rural families of children, adolescents, and adults with autism spectrum disorder (ASD) during the COVID-19 pandemic. Prior to COVID-19, the literature documented elevated stress in families of individuals with ASD and health and socioeconomic disparities for rural and underserved populations. These disparities were exacerbated by COVID-19 and the subsequent lockdowns and economic turmoil. Academic and adaptive skills training were particularly impacted by school closures, with parents tasked with taking on some responsibility for training these skills. Our goals for this article focus on special considerations for rural families regarding (a) neurobiological and developmental impacts of stressful experiences like COVID-19, (b) delineation of the impacts on individuals with ASD and other comorbid and related conditions, and (c) education and intervention needs during these times. Finally, we offer suggestions for future care during pandemic events, including recommendations for improving service delivery under such conditions.

3.
Front Psychol ; 13: 805792, 2022.
Article in English | MEDLINE | ID: mdl-35496220

ABSTRACT

The objective of this article was to review existing research to assess the evidence for predictive processing (PP) in sign language, the conditions under which it occurs, and the effects of language mastery (sign language as a first language, sign language as a second language, bimodal bilingualism) on the neural bases of PP. This review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework. We searched peer-reviewed electronic databases (SCOPUS, Web of Science, PubMed, ScienceDirect, and EBSCO host) and gray literature (dissertations in ProQuest). We also searched the reference lists of records selected for the review and forward citations to identify all relevant publications. We searched for records based on five criteria (original work, peer-reviewed, published in English, research topic related to PP or neural entrainment, and human sign language processing). To reduce the risk of bias, the remaining two authors, with expertise in sign language processing and a variety of research methods, reviewed the results. Disagreements were resolved through extensive discussion. In the final review, 7 records were included, of which 5 were published articles and 2 were dissertations. The reviewed records provide evidence for PP in signing populations, although the underlying mechanism in the visual modality is not clear. The reviewed studies addressed motor simulation proposals, the neural basis of PP, and the development of PP. All studies used dynamic sign stimuli. Most of the studies focused on semantic prediction. The question of the mechanism for the interaction between one's sign language competence (L1 vs. L2 vs. bimodal bilingual) and PP in the manual-visual modality remains unclear, primarily due to the scarcity of participants with varying degrees of language dominance. There is a paucity of evidence for PP in sign languages, especially for frequency-based, phonetic (articulatory), and syntactic prediction. However, studies published to date indicate that Deaf native/native-like L1 signers predict linguistic information during sign language processing, suggesting that PP is an amodal property of language processing. Systematic Review Registration: [https://www.crd.york.ac.uk/prospero/display_record.php?ID=CRD42021238911], identifier [CRD42021238911].

4.
PLoS One ; 17(2): e0262098, 2022.
Article in English | MEDLINE | ID: mdl-35213558

ABSTRACT

Longstanding cross-linguistic work on event representations in spoken languages has argued for a robust mapping between an event's underlying representation and its syntactic encoding, such that, for example, the agent of an event is most frequently mapped to subject position. In the same vein, sign languages have long been claimed to construct signs that visually represent their meaning, i.e., signs that are iconic. Experimental research on linguistic parameters such as plurality and aspect has recently shown some of them to be visually universal in sign, i.e., recognized by non-signers as well as signers, and has identified specific visual cues that achieve this mapping. However, little is known about what makes action representations in sign language iconic, or whether and how the mapping of underlying event representations to syntactic encoding is visually apparent in the form of a verb sign. To this end, we asked what visual cues non-signers may use in evaluating transitivity (i.e., the number of entities involved in an action). To do this, we correlated non-signer judgments about the transitivity of verb signs from American Sign Language (ASL) with phonological characteristics of these signs. We found that non-signers did not accurately guess the transitivity of the signs, but that non-signer transitivity judgments can nevertheless be predicted from the signs' visual characteristics. Further, non-signers cue in on just those features that code event representations across sign languages, despite interpreting them differently. This suggests the existence of visual biases that underlie detection of linguistic categories, such as transitivity, which may uncouple from underlying conceptual representations over time in mature sign languages due to lexicalization processes.


Subject(s)
Deafness/prevention & control , Linguistics/trends , Sign Language , Vision, Ocular/physiology , Deafness/physiopathology , Female , Fingers/physiology , Hand/physiology , Humans , Judgment , Male , Thumb/physiology
5.
Int J Behav Dev ; 45(5): 397-408, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34690387

ABSTRACT

Acquisition of natural language has been shown to fundamentally impact both one's ability to use the first language and the ability to learn subsequent languages later in life. Sign languages offer a unique perspective on this issue, because Deaf signers receive access to signed input at varying ages. The majority acquires sign language in (early) childhood, but some learn sign language later, a situation that is drastically different from that of spoken language acquisition. To investigate the effect of age of sign language acquisition and its potential interplay with age in signers, we examined grammatical acceptability ratings and reaction time measures in a group of Deaf signers (age range: 28-58 years) with early (0-3 years) or later (4-7 years) acquisition of sign language in childhood. Behavioral responses to grammatical word order variations (subject-object-verb vs. object-subject-verb) were examined across three sentence types: 1) simple sentences, 2) topicalized sentences, and 3) sentences involving manual classifier constructions, which are uniquely characteristic of sign languages. Overall, older participants responded more slowly. Age of acquisition had subtle effects on acceptability ratings, with the direction of the effect depending on the specific linguistic structure.

6.
J Exp Psychol Learn Mem Cogn ; 47(6): 998-1011, 2021 Jun.
Article in English | MEDLINE | ID: mdl-33211523

ABSTRACT

Nonsigners viewing sign language are sometimes able to guess the meaning of signs by relying on the overt connection between form and meaning, or iconicity (cf. Ortega, Özyürek, & Peeters, 2020; Strickland et al., 2015). One word class in sign languages that appears to be highly iconic is classifiers: verb-like signs that can refer to location change or handling. Classifier use and meaning are governed by linguistic rules, yet in comparison with lexical verb signs, classifiers are highly variable in their morpho-phonology (variety of potential handshapes and motion direction within the sign). These open-class linguistic items in sign languages prompt a question about the mechanisms of their processing: Are they part of a gestural-semiotic system (processed like the gestures of nonsigners), or are they processed as linguistic verbs? To examine the psychological mechanisms of classifier comprehension, we recorded the electroencephalogram (EEG) activity of signers who watched videos of signed sentences with classifiers. We manipulated the word order of the stimulus sentences (subject-object-verb [SOV] vs. object-subject-verb [OSV]) and contrasted the two conditions; according to different processing hypotheses, the OSV order should incur increased processing costs. As previously reported for lexical signs, we observed an N400 effect for OSV compared with SOV, reflecting increased cognitive load for linguistic processing. These findings support the hypothesis that classifiers are a linguistic part of speech in sign language, extending the current understanding of processing mechanisms at the interface of linguistic form and meaning. (PsycInfo Database Record (c) 2021 APA, all rights reserved).
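The N400 comparison described above reduces to averaging baseline-corrected epochs per condition and contrasting mean amplitude in a post-stimulus window. The sketch below illustrates this with synthetic single-electrode epochs; the sampling rate, trial counts, electrode, and 300-500 ms window are assumptions for illustration, not the study's analysis parameters.

```python
import numpy as np

fs = 500.0                                      # assumed EEG sampling rate (Hz)
rng = np.random.default_rng(1)
times = np.arange(-0.2, 0.8, 1 / fs)            # epoch from -200 to 800 ms

# Hypothetical trials x samples matrices for one centro-parietal electrode.
sov = rng.standard_normal((40, times.size))
osv = rng.standard_normal((40, times.size)) - 0.5 * ((times > 0.3) & (times < 0.5))

def baseline_correct(epochs):
    # Subtract each trial's mean over the pre-stimulus interval.
    return epochs - epochs[:, times < 0].mean(axis=1, keepdims=True)

sov, osv = baseline_correct(sov), baseline_correct(osv)

# Mean amplitude difference in the classic N400 window (300-500 ms);
# a more negative value for OSV is read as an N400 effect.
win = (times >= 0.3) & (times <= 0.5)
effect = osv[:, win].mean() - sov[:, win].mean()
print(f"OSV - SOV mean amplitude (300-500 ms): {effect:.3f}")
```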


Subject(s)
Psycholinguistics , Sign Language , Adult , Electroencephalography , Evoked Potentials , Female , Humans , Male , Middle Aged
7.
Wiley Interdiscip Rev Cogn Sci ; 11(1): e1518, 2020 Jan.
Article in English | MEDLINE | ID: mdl-31505710

ABSTRACT

To understand human language, both spoken and signed, the listener or viewer has to parse the continuous external signal into components. The question of what those components are (e.g., phrases, words, sounds, phonemes?) has been a subject of long-standing debate. We re-frame this question to ask: What properties of the incoming visual or auditory signal are indispensable to eliciting language comprehension? In this review, we assess the phenomenon of language parsing from a modality-independent viewpoint. We show that the interplay between dynamic changes in the entropy of the signal and neural entrainment to the signal at the syllable level (4-5 Hz range) is causally related to language comprehension in both speech and sign language. This modality-independent Entropy Syllable Parsing model for the linguistic signal offers insight into the mechanisms of language processing, suggesting common neurocomputational bases for syllables in speech and sign language. This article is categorized under: Linguistics > Linguistic Theory; Linguistics > Language in Mind and Brain; Linguistics > Computational Models of Language; Psychology > Language.
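A toy illustration of the two quantities the model relates: windowed entropy of a signal and its syllable-rate (4-5 Hz) component. The synthetic signal, sampling rate, window length, and bin count below are assumptions for illustration, not the model's actual parameters.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 100.0                                      # assumed sampling rate of the signal (Hz)
rng = np.random.default_rng(2)
signal = rng.standard_normal(int(60 * fs))      # stand-in visual/auditory envelope

# 1) Dynamic entropy: Shannon entropy of the amplitude distribution in
#    consecutive 500 ms windows (simple histogram estimate).
def windowed_entropy(x, win, bins=16):
    ent = []
    for start in range(0, x.size - win + 1, win):
        counts, _ = np.histogram(x[start:start + win], bins=bins)
        p = counts / counts.sum()
        p = p[p > 0]
        ent.append(-(p * np.log2(p)).sum())
    return np.array(ent)

entropy_track = windowed_entropy(signal, win=int(0.5 * fs))

# 2) Syllable-rate component: band-pass at 4-5 Hz and take the analytic
#    amplitude as a crude proxy for the entrainment target.
sos = butter(4, [4 / (fs / 2), 5 / (fs / 2)], btype="band", output="sos")
syllable_envelope = np.abs(hilbert(sosfiltfilt(sos, signal)))
print(entropy_track.shape, syllable_envelope.shape)
```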


Subject(s)
Communication , Linguistics , Comprehension , Humans , Models, Theoretical , Sign Language
8.
Brain Lang ; 200: 104708, 2020 01.
Article in English | MEDLINE | ID: mdl-31698097

ABSTRACT

One of the key questions in the study of human language acquisition is the extent to which the development of neural processing networks for different components of language is modulated by exposure to linguistic stimuli. Sign languages offer a unique perspective on this issue, because prelingually Deaf children who receive access to complex linguistic input later in life provide a window into brain maturation in the absence of language, and into the subsequent neuroplasticity of neurolinguistic networks during late language learning. While the duration of sensitive periods for the acquisition of linguistic subsystems (sound, vocabulary, and syntactic structure) is well established on the basis of L2 acquisition in spoken language, for sign languages, the relative timelines for the development of neural processing networks for linguistic sub-domains are unknown. We examined neural responses of a group of Deaf signers who received access to signed input at varying ages to three linguistic phenomena at the levels of classifier signs, syntactic structure, and information structure. The amplitude of the N400 response to the marked word order condition negatively correlated with the age of acquisition for syntax and information structure, indicating increased cognitive load in these conditions. Additionally, the combination of behavioral and neural data suggested that late learners preferentially relied on classifiers over word order for meaning extraction. This suggests that late acquisition of sign language significantly increases cognitive load during analysis of syntax and information structure, but not word-level meaning.


Subject(s)
Aging , Deafness/physiopathology , Deafness/psychology , Electroencephalography , Language Development , Learning/physiology , Linguistics , Sign Language , Adult , Evoked Potentials , Female , Humans , Male , Middle Aged , Neuronal Plasticity , Vocabulary
9.
Autism Res ; 13(1): 24-31, 2020 01.
Article in English | MEDLINE | ID: mdl-31702116

ABSTRACT

Autism spectrum disorder is increasingly understood to be based on atypical signal transfer among multiple interconnected networks in the brain. Relative temporal patterns of neural activity have been shown to underlie both the altered neurophysiology and the altered behaviors in a variety of neurogenic disorders. We assessed the variability of brain network dynamics in autism spectrum disorder (ASD) using measures of synchronization (phase-locking) strength and of the timing of synchronization and desynchronization of neural activity (desynchronization ratio) across frequency bands of resting-state electroencephalography (EEG). Our analysis indicated that frontoparietal synchronization is higher in ASD but is interrupted by more short periods of desynchronization. It also indicated that the relationship between the properties of neural synchronization and behavior is different in ASD and typically developing populations. Recent theoretical studies suggest that neural networks with a high desynchronization ratio have increased sensitivity to inputs. Our results point to the potential significance of this phenomenon to the autistic brain. This sensitivity may disrupt the production of appropriate neural and behavioral responses to external stimuli. Cognitive processes dependent on the integration of activity from multiple networks may be, as a result, particularly vulnerable to disruption. Autism Res 2020, 13: 24-31. © 2019 International Society for Autism Research, Wiley Periodicals, Inc. LAY SUMMARY: Parts of the brain can work together by synchronizing the activity of the neurons. We recorded the electrical activity of the brain in adolescents with autism spectrum disorder and then compared the recording to that of their peers without the diagnosis. We found that in participants with autism, there were many very short time periods of non-synchronized activity between frontal and parietal parts of the brain. Mathematical models show that a brain system with this kind of activity is very sensitive to external events.
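A simplified sketch of the two ingredients named above: phase-locking strength between two channels, and the proportion of time they spend desynchronized, computed from band-limited Hilbert phases. The channels, frequency band, and the threshold-based desynchronization ratio here are illustrative assumptions, not the paper's exact definitions.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 250.0                                      # assumed EEG sampling rate (Hz)
rng = np.random.default_rng(3)
frontal = rng.standard_normal(int(30 * fs))     # stand-in frontal channel
parietal = rng.standard_normal(int(30 * fs))    # stand-in parietal channel

# Band-pass both channels (here: alpha, 8-12 Hz) and extract instantaneous phase.
sos = butter(4, [8 / (fs / 2), 12 / (fs / 2)], btype="band", output="sos")
phase_f = np.angle(hilbert(sosfiltfilt(sos, frontal)))
phase_p = np.angle(hilbert(sosfiltfilt(sos, parietal)))
dphi = phase_f - phase_p

# Phase-locking value over the recording (0 = no locking, 1 = perfect locking).
plv = np.abs(np.exp(1j * dphi).mean())

# Crude desynchronization ratio: fraction of samples whose phase difference
# deviates from its circular mean by more than 45 degrees.
mean_dphi = np.angle(np.exp(1j * dphi).mean())
desynchronized = np.abs(np.angle(np.exp(1j * (dphi - mean_dphi)))) > np.pi / 4
print(f"PLV = {plv:.3f}, desynchronization ratio = {desynchronized.mean():.3f}")
```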


Subject(s)
Autism Spectrum Disorder/physiopathology , Brain/physiopathology , Electroencephalography/methods , Adolescent , Child , Female , Humans , Male , Neural Pathways/physiopathology
10.
Neuropsychologia ; 126: 138-146, 2019 03 18.
Article in English | MEDLINE | ID: mdl-28633887

ABSTRACT

Communication deficits in children with autism spectrum disorders (ASD) are often related to inefficient interpretation of emotional cues, which are conveyed visually through both facial expressions and body language. The present study examined ASD behavioral and ERP responses to emotional expressions of anger and fear, as conveyed by the face and body. Behavioral results showed significantly faster response times for the ASD than for the typically developing (TD) group when processing fear, but not anger, in isolated face expressions, isolated body expressions, and in the integration of the two. In addition, EEG data for the N170 and P1 indicated processing differences between fear and anger stimuli only in the TD group, suggesting that individuals with ASD may not distinguish between these emotional expressions. These results suggest that ASD children may employ a different neural mechanism for visual emotion recognition than their TD peers, possibly relying on inferential processing.


Subject(s)
Anger/physiology , Autism Spectrum Disorder/physiopathology , Child Development/physiology , Evoked Potentials/physiology , Facial Expression , Facial Recognition/physiology , Fear/physiology , Recognition, Psychology/physiology , Social Perception , Adolescent , Adolescent Development , Child , Cues , Electroencephalography , Female , Humans , Male
11.
Cortex ; 112: 69-79, 2019 03.
Article in English | MEDLINE | ID: mdl-30001920

ABSTRACT

The question of apparent discrepancies in short-term memory capacity for sign language and speech has long presented difficulties for models of verbal working memory. While short-term memory (STM) capacity for spoken language spans up to 7 ± 2 items, the verbal working memory capacity for sign languages appears to be lower at 5 ± 2. The assumption that both auditory and visual communication (sign language) rely on the same memory buffers led to claims of impaired STM buffers in sign language users. Yet, no common model deals with both the sensory and linguistic nature of spoken and sign languages. The authors present a generalized neural model (GNM) of short-term memory use across modalities, which accounts for experimental results in both sign and spoken languages. GNM postulates that during hierarchically organized processing phases in language comprehension, spoken language users rely on neural resources for spatial representation in a sequential rehearsal strategy, i.e., the phonological loop. The spatial nature of sign language precludes signers from utilizing a similar 'overflow' strategy, which speakers rely on to extend their STM capacity. This model offers a parsimonious neuroarchitectural explanation for the conflict between spatial and linguistic processing in spoken language, as well as for the differences observed in STM capacity for sign and speech.


Subject(s)
Auditory Perception/physiology , Memory, Short-Term/physiology , Models, Neurological , Sign Language , Speech/physiology , Visual Perception/physiology , Humans , Verbal Learning/physiology
12.
Wiley Interdiscip Rev Cogn Sci ; 10(2): e1484, 2019 Mar.
Article in English | MEDLINE | ID: mdl-30417551

ABSTRACT

This review compares how humans process action and language sequences produced by other humans. On the one hand, we identify commonalities between action and language processing in terms of cognitive mechanisms (e.g., perceptual segmentation, predictive processing, integration across multiple temporal scales), neural resources (e.g., the left inferior frontal cortex), and processing algorithms (e.g., comprehension based on changes in signal entropy). On the other hand, drawing on sign language with its particularly strong motor component, we also highlight what differentiates (both oral and signed) linguistic communication from nonlinguistic action sequences. We propose the multiscale information transfer framework (MSIT) as a way of integrating these insights and highlight directions in which future empirical research inspired by the MSIT framework might fruitfully evolve. This article is categorized under: Psychology > Language; Linguistics > Language in Mind and Brain; Psychology > Motor Skill and Performance; Psychology > Prediction.


Subject(s)
Brain , Cognition , Nonverbal Communication , Psycholinguistics , Social Perception , Speech Perception , Visual Perception , Humans
13.
Brain Res ; 1691: 105-117, 2018 07 15.
Article in English | MEDLINE | ID: mdl-29627484

ABSTRACT

Research on spoken languages has identified a "subject preference" processing strategy for tackling input that is syntactically ambiguous as to whether a sentence-initial NP is a subject or object. The present study documents that the "subject preference" strategy is also seen in the processing of a sign language, supporting the hypothesis that the "subject"-first strategy is universal and not dependent on the language modality (spoken vs. signed). Deaf signers of Austrian Sign Language (ÖGS) were shown videos of locally ambiguous signed sentences in SOV and OSV word orders. Electroencephalogram (EEG) data indicated higher cognitive load in response to OSV stimuli (i.e. a negativity for OSV compared to SOV), indicative of syntactic reanalysis cost. A finding that is specific to the visual modality is that the ERP (event-related potential) effect reflecting linguistic reanalysis occurred earlier than might have been expected, that is, before the time point when the path movement of the disambiguating sign was visible. We suggest that in the visual modality, transitional movement of the articulators prior to the disambiguating verb position or co-occurring non-manual (face/body) markings were used in resolving the local ambiguity in ÖGS. Thus, whereas the processing strategy of "subject preference" is cross-modal at the linguistic level, the cues that enable the processor to apply that strategy differ in signing as compared to speech.


Subject(s)
Comprehension/physiology , Evoked Potentials/physiology , Linguistics , Sign Language , Speech , Brain/physiology , Electroencephalography , Female , Humans , Language , Male , Photic Stimulation , Reaction Time/physiology , Time Factors
14.
Lang Speech ; 61(1): 97-112, 2018 03.
Article in English | MEDLINE | ID: mdl-28565932

ABSTRACT

The ability to convey information is a fundamental property of communicative signals. For sign languages, which are overtly produced with multiple, completely visible articulators, the question arises as to how the various channels coordinate and interact with each other. We analyze motion capture data of American Sign Language (ASL) narratives and show that the capacity of information throughput, mathematically defined, is highest on the dominant hand (DH). We further demonstrate that information transfer capacity is also significant for the non-dominant hand (NDH) and the head channel, as compared to control channels (the ankles). We discuss both redundancy and independence in articulator motion in sign language, and argue that the NDH and the head articulators contribute to the overall information transfer capacity, indicating that they are neither completely redundant to, nor completely independent of, the DH.
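A rough sketch of comparing articulators by an information measure, here the histogram entropy of frame-to-frame speed from 3-D trajectories. The synthetic trajectories, frame rate, and this particular entropy proxy are assumptions for illustration; the article's throughput measure is defined differently in detail.

```python
import numpy as np

fs = 120.0                                      # assumed motion-capture frame rate (Hz)
rng = np.random.default_rng(4)

# Hypothetical 3-D marker trajectories (frames x 3) for two articulators.
dominant_hand = np.cumsum(rng.standard_normal((int(60 * fs), 3)), axis=0)
ankle = np.cumsum(0.1 * rng.standard_normal((int(60 * fs), 3)), axis=0)

def speed_entropy(trajectory, fs, bins=32):
    """Histogram entropy (bits) of frame-to-frame speed, a crude throughput proxy."""
    speed = np.linalg.norm(np.diff(trajectory, axis=0), axis=1) * fs
    counts, _ = np.histogram(speed, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

print("dominant hand :", round(speed_entropy(dominant_hand, fs), 2), "bits")
print("ankle (control):", round(speed_entropy(ankle, fs), 2), "bits")
```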


Subject(s)
Comprehension , Hand/physiology , Movement , Sign Language , Visual Perception , Algorithms , Functional Laterality , Head Movements , Humans , Image Processing, Computer-Assisted , Time Factors , Video Recording
15.
Behav Brain Sci ; 40: e63, 2017 01.
Article in English | MEDLINE | ID: mdl-29342520

ABSTRACT

State-of-the-art methods for the analysis of video data now include motion capture and optical flow computed from video recordings. These techniques allow for biological differentiation between visual communication and noncommunicative motion, enabling further inquiry into the neural bases of communication. The requirements for additional noninvasive methods of data collection and for automatic analysis of natural gesture and sign language are discussed.
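Dense optical flow of the kind mentioned here can be extracted with standard tools; a minimal sketch using OpenCV's Farnebäck algorithm follows. The file name and parameter values are placeholders, not any specific dataset or published pipeline.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("signing_clip.mp4")      # hypothetical input video
ok, prev = cap.read()
if not ok:
    raise RuntimeError("could not read the video")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

motion = []                                     # per-frame mean flow magnitude
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Dense flow field (height x width x 2) between consecutive frames.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    motion.append(np.linalg.norm(flow, axis=2).mean())
    prev_gray = gray
cap.release()

motion = np.array(motion)                       # time series of overall motion
print(motion.shape)
```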


Subject(s)
Gestures , Sign Language , Humans , Language
16.
Exp Brain Res ; 234(12): 3425-3431, 2016 12.
Article in English | MEDLINE | ID: mdl-27465558

ABSTRACT

The heterogeneity of behavioral manifestation of autism spectrum disorders (ASDs) requires a model that incorporates an understanding of dynamic differences in neural processing between ASD and typically developing (TD) populations. We use a network approach to characterize the spatiotemporal dynamics of EEG data in TD and ASD youths. EEG recorded during both wakeful rest (resting state) and a social-visual task was analyzed using cross-correlation analysis of the 32-channel time series to produce weighted, undirected graphs corresponding to functional brain networks. The stability of these networks was assessed by novel use of the L1-norm for matrix entries (edit distance). Significantly more stable networks were observed in the resting condition than in the task condition in both populations. In resting state, stable networks persisted for a significantly longer time in children with ASD than in TD children; networks in ASD children also had a larger diameter, indicative of long-range connectivity. The resulting analysis combines key features of microstate and network analyses of EEG.
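The pipeline sketched in this abstract, pairwise channel correlations thresholded into a weighted graph, an L1 (edit) distance between successive adjacency matrices, and graph diameter, can be illustrated briefly. The data, threshold, and window handling below are illustrative assumptions rather than the study's parameters.

```python
import numpy as np
import networkx as nx

fs = 250.0
rng = np.random.default_rng(5)
window_1 = rng.standard_normal((32, int(2 * fs)))                 # 32 channels, one 2 s window
window_2 = window_1 + 0.1 * rng.standard_normal(window_1.shape)   # a perturbed later window

def correlation_adjacency(window, threshold=0.1):
    """Weighted, undirected adjacency matrix from pairwise channel correlations."""
    corr = np.abs(np.corrcoef(window))
    np.fill_diagonal(corr, 0.0)
    return np.where(corr >= threshold, corr, 0.0)

adj_1 = correlation_adjacency(window_1)
adj_2 = correlation_adjacency(window_2)

# Stability across windows: L1 distance between adjacency matrices
# (small values mean the network persists, in the spirit of the edit distance above).
edit_distance = np.abs(adj_1 - adj_2).sum()

# Diameter of the binarized network as a long-range-connectivity index.
g = nx.from_numpy_array((adj_1 > 0).astype(int))
diameter = nx.diameter(g) if nx.is_connected(g) else float("inf")
print(f"edit distance = {edit_distance:.2f}, diameter = {diameter}")
```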


Subject(s)
Autism Spectrum Disorder/pathology , Brain/physiopathology , Evoked Potentials/physiology , Neural Pathways/physiology , Nonlinear Dynamics , Adolescent , Analysis of Variance , Child , Electroencephalography , Female , Humans , Male , Models, Neurological , Time Factors
17.
Neurocase ; 21(6): 753-66, 2015.
Article in English | MEDLINE | ID: mdl-25529497

ABSTRACT

Comprehension of complex sentences is necessarily supported by both syntactic and semantic knowledge, but what linguistic factors trigger a reader's reliance on a specific system? This functional neuroimaging study orthogonally manipulated argument plausibility and verb event type to investigate the cortical bases of semantic effects on argument comprehension during reading. The data suggest that telic verbs facilitate online processing by means of consolidating the event schemas in episodic memory and by easing the computation of syntactico-thematic hierarchies in the left inferior frontal gyrus. The results demonstrate that syntax-semantics integration relies on trade-offs among a distributed network of regions for maximum comprehension efficiency.


Subject(s)
Brain/physiopathology , Comprehension/physiology , Linguistics , Reading , Adolescent , Adult , Brain Mapping , Female , Humans , Magnetic Resonance Imaging , Male , Semantics , Young Adult
18.
J Psycholinguist Res ; 44(5): 533-44, 2015 Oct.
Article in English | MEDLINE | ID: mdl-24866361

ABSTRACT

This study examined the electrophysiological signatures of deductive and probabilistic reasoning. Deduction is defined as the case in which a conclusion can be found to be true or false due to the validity of the argument. In probabilistic reasoning, however, conclusions can be considered likely or unlikely, but not with certainty, due to the lack of validity in the form of the argument. Sixteen participants were presented with both types of arguments while response times and ERPs were recorded. For each argument, participants had to decide what type of reasoning was appropriate and which of four responses (certainly yes, probably yes, probably no, and certainly no) was most appropriate. Response times indicated faster processing of deductive arguments. N2 amplitude distinguished between positive and negative responses in the deductive condition, but not in the probabilistic one, suggesting partial differentiation between the cognitive processes required for the two types of reasoning.


Subject(s)
Brain/physiology , Cognition/physiology , Evoked Potentials/physiology , Problem Solving/physiology , Adult , Electroencephalography , Female , Humans , Judgment/physiology , Male , Young Adult
19.
PeerJ ; 2: e446, 2014.
Article in English | MEDLINE | ID: mdl-25024915

ABSTRACT

Prior studies investigating cortical processing in Deaf signers suggest that life-long experience with sign language and/or auditory deprivation may alter the brain's anatomical structure and the function of brain regions typically recruited for auditory processing (Emmorey et al., 2010; Pénicaud et al., 2013 inter alia). We report the first investigation of the task-negative network in Deaf signers and its functional connectivity, that is, the temporal correlations among spatially remote neurophysiological events. We show that Deaf signers manifest increased functional connectivity between the posterior cingulate/precuneus and the left medial temporal gyrus (MTG), as well as between the inferior parietal lobe and the medial temporal gyrus in the right hemisphere, areas that have been found to show functional recruitment specifically during sign language processing. These findings suggest that the organization of the brain at the level of inter-network connectivity is likely affected by experience with processing visual language, although sensory deprivation could be another source of the difference. We hypothesize that connectivity alterations in the task-negative network reflect predictive/automatized processing of the visual signal.

20.
Brain Cogn ; 86: 98-103, 2014 Apr.
Article in English | MEDLINE | ID: mdl-24607732

ABSTRACT

The impact of handedness on language processing has been studied extensively, and the results indicate that there is a relationship between the two variables; however, the nature of the relationship is not at all clear. In the current study, we explored degree of handedness (DH), as opposed to direction, in a group of right-handed individuals. fMRI was used to explore the impact of DH on the sentence comprehension network. The results revealed that, during sentence comprehension, activation in regions linked to semantic memory (e.g., anterior temporal cortex) was modulated by DH. Unexpectedly, the precuneus/posterior cingulate gyrus, which has been linked to episodic memory, was also affected by DH. These results extend those reported previously by showing that the neural architecture that supports sentence comprehension is modulated by DH. More specifically, the results presented here support the hypothesis proposed by Townsend, Carrithers, and Bever (2001) that DH interacts with the language system and impacts the strategy used during sentence comprehension.


Subject(s)
Brain/physiology , Comprehension/physiology , Functional Laterality/physiology , Language , Adult , Brain Mapping , Female , Humans , Magnetic Resonance Imaging , Male , Nerve Net/physiology , Semantics , Young Adult