Results 1 - 20 of 95
1.
Brain Topogr ; 2024 Apr 16.
Article in English | MEDLINE | ID: mdl-38625520

ABSTRACT

The literature has demonstrated the potential for detecting accurate electrical signals that correspond to the will or intention to move, as well as for decoding the thoughts of individuals who imagine houses, faces or objects. This investigation examines the presence of precise neural markers of imagined motivational states by combining electrophysiological and neuroimaging methods. Twenty participants were instructed to vividly imagine the desire to move, to listen to music, or to engage in social activities. Their EEG was recorded from 128 scalp sites and analysed using individual standardized Low-Resolution Brain Electromagnetic Tomographies (LORETAs) in the N400 time window (400-600 ms). The activation of 1056 voxels was examined in relation to the three motivational states. The most active dipoles were grouped into eight regions of interest (ROIs): Occipital, Temporal, Fusiform, Premotor, Frontal, OBF/IF, Parietal, and Limbic areas. The statistical analysis revealed that all imagined motivational states engaged the right hemisphere more than the left. Distinct markers were identified for the three motivational states: the right temporal area was more relevant for "Social Play", the orbitofrontal/inferior frontal cortex for listening to music, and the left premotor cortex for the "Movement" desire. This outcome is encouraging with regard to the potential use of neural indicators in brain-computer interfaces for interpreting the thoughts and desires of individuals with locked-in syndrome.

2.
Front Psychiatry ; 15: 1357770, 2024.
Article in English | MEDLINE | ID: mdl-38638416

ABSTRACT

Introduction: The capacity to understand others' emotional states, particularly negative ones (e.g. sadness or fear), underpins the empathic and social brain. Patients who cannot express their emotional states experience social isolation and loneliness, exacerbating distress. We investigated the feasibility of detecting non-invasive scalp-recorded electrophysiological signals that correspond to recalled emotional states of sadness, fear, and joy for potential classification. Methods: The neural activation patterns of 20 healthy, right-handed participants were studied using an electrophysiological technique. Analyses focused on the N400 component of event-related potentials (ERPs) recorded during silent recall of subjective emotional states; standardized weighted Low-Resolution Electromagnetic Tomography (swLORETA) was employed for source reconstruction. The study classified individual patterns of brain activation linked to the recollection of three distinct emotional states into seven regions of interest (ROIs). Results: Statistical analysis (ANOVA) of the individual magnitude values revealed the existence of a common emotional circuit, as well as distinct brain areas that were specifically active during recalled sad, happy and fearful states. In particular, the right temporal and left superior frontal areas were more active for sadness, the left limbic region for fear, and the right orbitofrontal cortex for happy affective states. Discussion: This study demonstrated the feasibility of detecting scalp-recorded electrophysiological signals corresponding to internal and subjective affective states. These findings contribute to our understanding of the emotional brain and have potential applications for future BCI classification and identification of emotional states in locked-in syndrome (LIS) patients who may be unable to express their emotions, thus helping to alleviate social isolation and the sense of loneliness.

3.
Sci Rep ; 14(1): 3506, 2024 02 12.
Article in English | MEDLINE | ID: mdl-38347056

ABSTRACT

Considerable evidence suggests that musical education induces structural and functional neuroplasticity in the brain. This study explored the potential impact of such changes on word-reading proficiency. We investigated whether musical training promotes the development of uncharted orthographic regions in the right hemisphere, leading to better reading abilities. A total of 60 healthy, right-handed, culturally matched professional musicians and controls took part in this research. They were categorised as normo-typical readers based on their reading speed (syl/sec) and subdivided into two groups of relatively good and poor readers. High-density EEG/ERPs were recorded while participants engaged in a note or letter detection task. Musicians were more fluent in word, non-word and text reading tests, and faster at detecting both notes and words. They also exhibited greater N170 and P300 responses, and larger target/non-target differences for words, than controls. Similarly, good readers showed larger N170 and P300 responses than poor readers. Increased reading skills were associated with bilateral activation of the occipito-temporal cortex during music and word reading. Source reconstruction also showed reduced activation of the left fusiform gyrus, and of areas devoted to attentional/ocular shifting, in poor vs. good readers and in controls vs. musicians. The data suggest that music literacy acquired early in life can shape reading circuits by promoting the specialization of a right-sided reading area, whose activity was here associated with enhanced reading proficiency. In conclusion, music literacy induces measurable neuroplastic changes in the left and right occipito-temporal (OT) cortex, responsible for improved word-reading ability.


Subject(s)
Dyslexia , Music , Humans , Reading , Literacy , Brain/physiology , Magnetic Resonance Imaging
5.
iScience ; 26(10): 108085, 2023 Oct 20.
Article in English | MEDLINE | ID: mdl-37860769

ABSTRACT

Racial bias (nonconscious behavioral inclinations against people of other ethnic groups) heavily contributes to inequality and discrimination. Immersive virtual reality (IVR) can reduce implicit racial bias through the feeling of owning (embodying) a virtual body of a different "race"; however, this effect has so far been demonstrated only behaviorally, for implicit attitudes. Here, we investigated the implicit (racial IAT) and neurophysiological (the N400 component of event-related potentials to verbal stimuli that violated negative racial stereotypes) correlates of the embodiment-induced reduction of implicit racial bias. After embodying a Black avatar, Caucasian participants showed reduced implicit racial bias (IAT), but both groups showed the typical N400. This is the first evidence to suggest that virtual embodiment affects the evaluative component of implicit biases but not the neurophysiological index of their cognitive component (i.e., stereotyping). This can inform interventions that promote inclusivity through implicit/indirect procedures, such as embodiment.

6.
Article in English | MEDLINE | ID: mdl-37535483

ABSTRACT

Brain-computer interfaces (BCIs) have revolutionized the way humans interact with machines, particularly for patients with severe motor impairments. EEG-based BCIs have limited functionality due to the restricted pool of stimuli that they can distinguish, while those based on event-related potentials have so far employed paradigms that require the patient's perception of the eliciting stimulus. In this work, we propose MIRACLE: a novel BCI system that combines functional data analysis and machine-learning techniques to decode patients' mental content from the elicited potentials. MIRACLE relies on a hierarchical ensemble classifier that recognizes 10 different semantic categories of imagined stimuli. We validated MIRACLE on an extensive dataset collected from 20 volunteers, with both imagined and perceived stimuli, to compare system performance on the two. Furthermore, we quantified the importance of each EEG channel in the classifier's decision-making process, which can help reduce the number of electrodes required for data acquisition, enhancing patients' comfort.


Subject(s)
Brain-Computer Interfaces , Electroencephalography , Humans , Electroencephalography/methods , Evoked Potentials , Electrodes
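A hierarchical ensemble of the kind the MIRACLE abstract describes can be sketched as follows. The actual system's features, category hierarchy, and base classifiers are not given in the abstract, so the synthetic data, the two-level split, and the choice of random forests below are all illustrative assumptions:

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic "EEG feature vectors": 10 fine semantic categories in an
# 8-D feature space, grouped into 2 hypothetical coarse superclasses.
X, y_fine = make_blobs(n_samples=1000, n_features=8, centers=10,
                       cluster_std=1.0, random_state=0)
y_coarse = y_fine // 5  # assumed 2-level hierarchy

X_tr, X_te, yf_tr, yf_te, yc_tr, yc_te = train_test_split(
    X, y_fine, y_coarse, random_state=0)

# Level 1: decide the superclass.
top = RandomForestClassifier(random_state=0).fit(X_tr, yc_tr)

# Level 2: one specialist classifier per superclass, trained only on
# that superclass's samples.
experts = {
    c: RandomForestClassifier(random_state=0).fit(X_tr[yc_tr == c],
                                                  yf_tr[yc_tr == c])
    for c in np.unique(y_coarse)
}

def predict(X):
    """Route each sample through its predicted superclass's expert."""
    coarse = top.predict(X)
    out = np.empty(len(X), dtype=int)
    for c, clf in experts.items():
        mask = coarse == c
        if mask.any():
            out[mask] = clf.predict(X[mask])
    return out

acc = (predict(X_te) == yf_te).mean()
```

Note that a level-1 misclassification cannot be recovered at level 2; this error propagation is a standard trade-off of hierarchical designs.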
8.
Psychophysiology ; 60(6): e14301, 2023 06.
Article in English | MEDLINE | ID: mdl-37017263

ABSTRACT

The C1 ERP component reflects the earliest visual processing in V1. However, whether attentional load can influence it remains debated. We conducted two EEG experiments to investigate the effect of attentional load on the C1. Task difficulty was manipulated at fixation using an oddball detection task that was either easy (low load) or difficult (high load), while a distractor was presented in the upper visual field (UVF) to score the C1. In Experiment 1, we used a block design, and the stimulus onset asynchrony (SOA) between the central stimulus and the peripheral distractor was either short or long. In Experiment 2, task difficulty was manipulated on a trial-by-trial basis using a visual cue, and the peripheral distractor was presented either before or after the central stimulus. The results showed that the C1 was larger in the high- than in the low-load condition, irrespective of SOA, in Experiment 1. In Experiment 2, no significant load modulation of the C1 was observed. However, the contingent negative variation (CNV) was larger in the low- than in the high-load condition. Moreover, the C1 was larger when the peripheral distractor was presented after, rather than before, the central stimulus. Taken together, these results suggest that different top-down control processes can influence the initial feedforward stage of visual processing in V1 captured by the C1 ERP component.


Subject(s)
Electroencephalography , Evoked Potentials, Visual , Humans , Photic Stimulation/methods , Electroencephalography/methods , Attention , Visual Perception , Reaction Time
9.
Soc Neurosci ; 18(1): 46-64, 2023 02.
Article in English | MEDLINE | ID: mdl-37058081

ABSTRACT

The aim of this study was to investigate the neural underpinnings and time course of emoji recognition by recording event-related potentials in 51 participants engaged in a categorization task involving an emotional word paradigm. Forty-eight emojis (happy, sad, surprised, disgusted, fearful, and angry) and as many facial expressions were used as stimuli. Behavioral data showed that emojis were recognized faster and more accurately (92.7%) than facial expressions displaying the same emotions (87.35%). Participants were better at recognizing happy, disgusted, and sad emojis, and happy and angry faces. Fear was difficult to recognize in both faces and emojis. The N400 response was larger to incongruently primed emojis and faces, while the opposite was observed for the P300 component. However, both N400 and P300 were considerably later in response to faces than to emojis. The emoji-related N170 component (150-190 ms) discriminated stimulus affective content, similar to the face-related N170, but its neural generators included not the fusiform face area but rather the occipital face area (OFA), involved in processing face details, and object-related areas. Both faces and emojis activated the limbic system and the orbitofrontal cortex, supporting anthropomorphization. The schematic nature of emojis might determine an easier classification of their emotional content.


Subject(s)
Evoked Potentials , Facial Recognition , Humans , Male , Female , Evoked Potentials/physiology , Electroencephalography/methods , Facial Expression , Emotions/physiology , Neuroimaging , Facial Recognition/physiology
10.
Behav Sci (Basel) ; 13(3)2023 Mar 22.
Article in English | MEDLINE | ID: mdl-36975303

ABSTRACT

Emojis are colorful ideograms resembling stylized faces, commonly used for expressing emotions in instant messaging, on social network sites, and in email communication. Notwithstanding their increasingly pervasive use in electronic communication, their psychological properties and communicative efficacy have not been much investigated. Here, we presented 112 different human facial expressions and emojis (expressing neutrality, happiness, surprise, sadness, anger, fear, and disgust) to a group of 96 female and male university students engaged in recognizing their emotional meaning. Analyses of variance showed that male participants were significantly better than female participants at recognizing emojis (especially negative ones), whereas female participants were better at recognizing human facial expressions. Quite interestingly, male participants were better at recognizing emojis than human facial expressions per se. These findings are in line with recent evidence suggesting that male individuals may be more competent with, and inclined to use, emojis to express their emotions in messaging (especially sarcasm, teasing, and love) than previously thought. Finally, the data indicate that emojis are less ambiguous than facial expressions (except for neutral and surprise emotions), possibly because of their limited number of fine-grained details and the lack of morphological features conveying facial identity.

11.
Psychophysiology ; 60(8): e14280, 2023 08.
Article in English | MEDLINE | ID: mdl-36847283

ABSTRACT

Previous research suggests that masks disrupt expression recognition, but the neurophysiological implications of this phenomenon are poorly understood. In this study, 26 participants underwent EEG/ERP recording during the recognition of six masked/unmasked facial expressions. An emotion/word congruence paradigm was used. The face-specific N170 was significantly larger to masked than unmasked faces. The N400 component was larger to incongruent faces, but the differences were more substantial for positive emotions (especially happiness). The anterior P300 (reflecting workload) was larger to masked than unmasked faces, while the posterior P300 (reflecting categorization certainty) was larger to unmasked than masked faces, and to angry faces. Face masking was more detrimental to sadness, fear, and disgust than to positive emotions such as happiness. In addition, masking did not impair the recognition of angry faces, as the wrinkled forehead and frowning eyebrows remained visible. Overall, face masking polarized nonverbal communication toward the happiness/anger dimension, while minimizing emotions that stimulate an empathic response.


Subject(s)
Empathy , Facial Recognition , Humans , Male , Female , Masks , Electroencephalography , Facial Expression , Evoked Potentials/physiology , Emotions/physiology , Facial Recognition/physiology
12.
Brain Cogn ; 166: 105954, 2023 03.
Article in English | MEDLINE | ID: mdl-36657242

ABSTRACT

This study investigated the psychophysiological markers of imagery processes through EEG/ERP recordings. Visual and auditory stimuli representing 10 different semantic categories were shown to 30 healthy participants. After a given interval, and prompted by a light signal, participants were asked to activate a mental image corresponding to the semantic category while synchronized electrical potentials were recorded. Unprecedented electrophysiological markers of imagination were recorded in the absence of sensory stimulation. The following peaks were identified at specific scalp sites and latencies during imagination of infants (centroparietal positivity, CPP, and late CPP), human faces (anterior negativity, AN), animals (anterior positivity, AP), music (P300-like), speech (N400-like), affective vocalizations (P2-like) and sensory (visual vs. auditory) modality (PN300). Overall, the perception and imagery conditions shared some common electro/cortical markers, but during imagery the category-dependent modulation of ERPs occurred at longer latencies and was more anterior than in the perceptual condition. These ERP markers might be valuable tools for BCI systems (pattern recognition, classification, or AI algorithms) applied to patients affected by disorders of consciousness (e.g., in a vegetative or comatose state) or locked-in patients (e.g., spinal or ALS patients).


Subject(s)
Electroencephalography , Evoked Potentials , Animals , Humans , Male , Female , Imagination/physiology , Auditory Perception
13.
J Neurosci Res ; 101(5): 730-738, 2023 05.
Article in English | MEDLINE | ID: mdl-33608982

ABSTRACT

Many studies have reported sex differences in empathy and social skills. In this review, several lines of empirical evidence on sex differences in the functions and anatomy of the social brain are discussed. The most relevant differences involve face processing, facial expression recognition, response to baby schema, the ability to see faces in things, the processing of social interactions, the response to others' pain, interest in social information, the processing of gestures and actions, and biological motion, erotic, and affective stimuli. Sex differences in oxytocin-based parental responses are also reported. In conclusion, the female and male brains show several neuro-functional differences in various aspects of social cognition, especially in emotional coding, face processing, and response to baby schema. An interpretation of this sexual dimorphism is provided from the perspective of evolutionary psychobiology.


Subject(s)
Sex Characteristics , Social Cognition , Male , Female , Humans , Brain/physiology , Emotions/physiology , Empathy , Facial Expression , Cognition/physiology
14.
Front Behav Neurosci ; 16: 1025870, 2022.
Article in English | MEDLINE | ID: mdl-36523756

ABSTRACT

Objective: The majority of BCI systems enabling communication with patients with locked-in syndrome are based on electroencephalogram (EEG) frequency analysis (e.g., linked to motor imagery) or P300 detection. Only recently has the use of event-related brain potentials (ERPs) received much attention, especially for face or music recognition, but neuro-engineering research into this new approach has not yet been carried out. The aim of this study was to provide a variety of reliable ERP markers of visual and auditory perception for the development of new and more complex mind-reading systems for reconstructing mental content from brain activity. Methods: A total of 30 participants were shown 280 color pictures (adult, infant, and animal faces; human bodies; written words; checkerboards; and objects) and 120 auditory files (speech, music, and affective vocalizations). The paradigm did not involve target selection, to avoid artifactual waves linked to decision-making and response preparation (e.g., P300 and motor potentials) masking the neural signature of semantic representation. Overall, 12,000 ERP waveforms × 126 electrode channels (i.e., 1,512,000 single waveforms) were processed and artifact-rejected. Results: Clear and distinct category-dependent markers of perceptual and cognitive processing were identified through statistical analyses, some of which were novel to the literature. Results are discussed in view of current knowledge of ERP functional properties and with respect to machine-learning classification methods previously applied to similar data. Conclusion: The statistical analyses discriminated the perceptual categories eliciting the various electrical potentials with a high level of accuracy (p ≤ 0.01). Therefore, the ERP markers identified in this study could be significant tools for optimizing BCI systems [pattern recognition or artificial intelligence (AI) algorithms] applied to EEG/ERP signals.
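The kind of processing this abstract summarizes (thousands of single-trial waveforms, artifact-rejected and averaged) follows a standard ERP pipeline that can be sketched with plain NumPy. The sampling rate, rejection threshold, component shape, and measurement window below are illustrative assumptions, not the study's parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
sfreq = 512                                # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.6, 1 / sfreq)    # epoch from -100 to 600 ms

# Simulate 100 single-trial epochs: a 10 µV Gaussian "component"
# peaking at ~200 ms, buried in noise.
component = 10 * np.exp(-((times - 0.2) ** 2) / (2 * 0.02 ** 2))
trials = component + rng.normal(0, 5, size=(100, times.size))
trials[::20] += 200                        # a few trials carry large artifacts

# Artifact rejection: drop epochs whose absolute amplitude exceeds 100 µV.
clean = trials[np.abs(trials).max(axis=1) < 100]

# Average the surviving epochs into an ERP, then measure the component
# as the mean amplitude in a 150-250 ms window.
erp = clean.mean(axis=0)
window = (times >= 0.15) & (times <= 0.25)
mean_amp = erp[window].mean()
```

In practice a dedicated package such as MNE-Python would handle filtering, epoching, and rejection, but the underlying arithmetic reduces to what is shown here.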

15.
Sci Rep ; 12(1): 19822, 2022 11 17.
Article in English | MEDLINE | ID: mdl-36396694

ABSTRACT

Musical learning is related to the development of audio-visuomotor associations linking gestures with musical sounds. To study the role of auditory feedback in learning, 115 students (56 guitarists, 59 pianists) at beginner, intermediate and advanced levels were recruited. Playing with sound (audio-motor feedback, A+M+), mute practice (motor feedback, A-M+), and piece listening (auditory feedback, A+M-) were compared with first-sight reading (A-M-) to assess the role of auditory and motor feedback in procedural learning. The procedure consisted of the execution of a standard piece for determining the students' level and 4 further music executions (every week for 4 weeks), each preceded by a different practice condition (for 12 min, once a day, for 5 days). Real musical pieces (e.g., Segovia, Schubert, Bartók) were used. Performance evaluation focused on four macro-categories: notes, rhythm, dynamics and smoothness. For both instruments, first-sight reading (A-M-) was associated with the worst performance; silent motor practice (A-M+) resulted in learning the rhythmic structure of the piece and in a smoother performance. Listening to the pieces (A+M-) resulted in learning the agogics and in improved articulation and smoothness. Listening during performance (A+M+) resulted in fewer intonation errors. Interestingly, auditory feedback was more relevant for beginners than for advanced students, as evidenced by the greater benefits of listening during practice.


Subject(s)
Music , Humans , Feedback , Feedback, Sensory , Auditory Perception , Learning
16.
Front Psychol ; 13: 943613, 2022.
Article in English | MEDLINE | ID: mdl-35992482

ABSTRACT

Cognitive neuroscience has inspired a number of methodological advances to extract the highest signal-to-noise ratio from neuroimaging data. Popular techniques used to summarize behavioral data include sum-scores and item response theory (IRT). While these techniques can be useful when applied appropriately, item dimensionality and the quality of information are often left unexplored, allowing poorly performing items to be included in an item set. The purpose of this study is to highlight how the application of two-stage approaches introduces parameter bias, how differential item functioning (DIF) can manifest in cognitive neuroscience data, and how techniques such as the multiple indicator multiple cause (MIMIC) model can identify and remove items with DIF and model these data with greater sensitivity for brain-behavior relationships. This was done through a simulation and an empirical study. The simulation explores parameter bias across two separate techniques used to summarize behavioral data, sum-scores and IRT, and formative relationships with those estimated from a MIMIC model. In the empirical study, participants performed an emotional identification task while concurrent electroencephalogram data were acquired across 384 trials. Participants were asked to identify the emotion presented by a static face of a child across four categories: happy, neutral, discomfort, and distress. The primary outcomes of interest were P200 event-related potential (ERP) amplitude and latency within each emotion category. Instances of DIF related to correct emotion identification were explored with respect to individuals' neurophysiology; specifically, an item's difficulty and discrimination were examined with respect to an individual's average P200 amplitude and latency using a MIMIC model. 
The MIMIC model's sensitivity was then compared with popular two-stage approaches for summarizing cognitive performance, including sum-scores and an IRT model framework, in which the summary scores are regressed onto the ERP characteristics. Here, sensitivity refers to the magnitude and significance of the coefficients relating the brain to these behavioral outcomes. The first set of analyses revealed instances of DIF within all four emotions; these items were removed from all further models. The next set of analyses compared the two-stage approaches with the MIMIC model. Only the MIMIC model identified any significant brain-behavior relationships. Taken together, these results indicate that item performance can be gleaned from subject-specific biomarkers, and that techniques such as the MIMIC model may be useful tools for deriving complex item-level brain-behavior relationships.
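The "two-stage" sum-score approach this abstract contrasts with the MIMIC model can be made concrete with a small simulation under an assumed two-parameter logistic (2PL) item response model; every parameter below is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(1)
n_subj, n_items = 500, 12
theta = rng.normal(0, 1, n_subj)        # latent trait ("ability")
a = rng.uniform(0.8, 2.0, n_items)      # item discriminations
b = rng.normal(0, 1, n_items)           # item difficulties

# 2PL model: P(correct) = logistic(a * (theta - b)).
p = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - b)))
responses = (rng.random((n_subj, n_items)) < p).astype(int)

# Stage 1 of a two-stage approach: collapse the items into a sum-score.
sum_score = responses.sum(axis=1)

# The equally weighted sum-score tracks the latent trait only
# imperfectly, since it ignores the items' differing discriminations.
r = np.corrcoef(sum_score, theta)[0, 1]
```

Because the sum-score weights every item equally, low-discrimination items dilute the summary; this is one source of the parameter bias the study discusses, which the MIMIC model addresses by modeling the items and covariates jointly.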

17.
PLoS One ; 16(11): e0260540, 2021.
Article in English | MEDLINE | ID: mdl-34818377

ABSTRACT

The present study used EEG/ERPs to detect the activation of implicit stereotypical representations associated with other-race (OR) people, and the modulation of such activation through the prior presentation of positive vs. neutral social information. Electrophysiological signals were recorded in 40 Italian Caucasian participants, unaware of the study's overall purpose. They were presented with 285 sentences that could either violate, not violate (e.g., "the Roma girl was involved in a robbery"), or be neutral with regard to stereotypical concepts concerning other-race people (e.g. Asians, Africans, Arabs). ERPs were time-locked to the terminal words. Prior to the sentence reading task, participants were exposed to a 10-minute colour video documentary. While the experimental group was presented with a video picturing other-race characters involved in "prestigious" activities that violated negative stereotypical assumptions (e.g. a Black neurosurgeon leading a surgery team), the control group viewed a neutral documentary about flora and fauna. EEG signals were then recorded during the sentence reading task to explore whether prior exposure to the experimental video could modulate the detection of incongruence in the sentences violating stereotypes, as marked by the N400 response. A fictitious task was adopted, consisting of detecting rare animal names. Indeed, only the control group showed a greater N400 response (350-550 ms) to words incongruent with ethnic stereotypes compared with congruent and neutral ones, thus suggesting the presence of a racial bias. No N400 effect was found for the experimental group, suggesting a lack of negative expectations about OR individuals. The swLORETA inverse solution, performed on the prejudice-related N400, showed that the inferior temporal and the superior and middle frontal gyri were the strongest N400 intra-cortical sources. 
Regardless of the experimental manipulation, congruent terminal words evoked a greater P300 response (500-600 ms) than incongruent and neutral ones, and a late frontal positivity (650-800 ms) was larger to sentences involving OR than own-race characters (either congruent or incongruent with the prejudice), thus possibly indicating bias-free perceptual in-group/out-group categorization processes. The data show how a pre-existing racial prejudice (as reflected by the N400 effect) can be modulated through exposure to positive media-driven information about OR people. Further follow-up studies should determine the duration of this modulatory effect over time and across contexts.


Subject(s)
Racism , Adult , Electroencephalography , Evoked Potentials , Female , Humans , Learning , Male , Reading , Stereotyped Behavior , Young Adult
18.
Eur J Neurosci ; 54(7): 6553-6574, 2021 10.
Article in English | MEDLINE | ID: mdl-34486754

ABSTRACT

The N40 is a well-known component of auditory and somatosensory evoked potentials, but it is much less recognized in the visual modality. To be detected with event-related potentials (ERPs), it requires an optimal signal-to-noise ratio. To investigate the nature of the visual N40, we recorded EEG/ERP signals from 20 participants. Each of them was presented with 1800 spatial frequency gratings of 0.75, 1.5, 3 and 6 c/deg. Data were collected from 128 sites while participants were engaged in both passive viewing and attention conditions. The N40 (30-55 ms) was modulated by alertness and selective attention; in fact, it was larger to targets than to irrelevant and passively viewed spatial frequency gratings. Its strongest intracranial sources were the bilateral thalamic nuclei of the pulvinar, according to swLORETA. The active network included the precuneus, insula and inferior parietal lobule. An N80 component (60-90 ms) was also identified, which was larger to targets than to irrelevant/passive stimuli and more negative to high than to low spatial frequencies. In contrast, the N40 was not sensitive to spatial frequency per se, nor did it show a polarity inversion as a function of spatial frequency. Attention, alertness and spatial frequency effects were also found for the later components P1, N2 and P300. The attentional effects increased in magnitude over time. The data show that ERPs can pick up the earliest synchronized activity, deriving in part from thalamic nuclei, before visual information has actually reached the occipital cortex.


Subject(s)
Evoked Potentials, Visual , Scalp , Attention , Electroencephalography , Evoked Potentials , Evoked Potentials, Auditory , Humans , Photic Stimulation
19.
Neuropsychologia ; 155: 107808, 2021 05 14.
Article in English | MEDLINE | ID: mdl-33636156

ABSTRACT

The present investigation used ERPs to detect the activation of implicit stereotypical representations associated with different ethnic groups, by means of an implicit paradigm. A total of 285 sentences were presented to 20 Italian Caucasian participants while EEG signals were recorded from 128 scalp sites. Sentences could either violate (Incongruent condition), not violate (Congruent condition), or be neutral (Neutral condition) with regard to stereotypical concepts concerning non-Caucasian ethnic groups (e.g. Asians, Africans, Arabs). No awareness of, or judgment about, stereotypes was required: participants were engaged in a fictitious task and were unaware of the overall purpose of the study. The results showed that Incongruent terminal words elicited a greater anterior N400 response (300-500 ms) than Congruent and Neutral words, reflecting a difficulty in integrating information incongruent with pre-existing stereotypical knowledge. Participants' individual amplitude values of the N400 difference wave (Incongruent - Congruent) correlated directly with their individual racism scores on the Subtle and Blatant Prejudice Scale, administered at the end of the experimental session. Intra-cortical sources explaining the N400 involved areas of the social cognition network, such as the medial frontal cortex (BA10) and the inferior temporal gyrus (BA20), which are known to support the processing of information about other people and impression formation. Moreover, Congruent terminal words evoked a greater P300 response (500-600 ms) than the other conditions, possibly reflecting the merging of incoming inputs with anticipated semantic information. A late post-N400 frontal positivity (650-800 ms) was larger to sentences concerning other-race characters (either congruent or incongruent) than to sentences involving own-race characters (neutral). 
The study corroborated the effectiveness of neurophysiological measures for assessing implicit complex semantic representations while circumventing problems related to social desirability.


Subject(s)
Electroencephalography , Ethnicity , Evoked Potentials , Female , Humans , Judgment , Male , Semantics
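The difference-wave correlation this abstract reports can be expressed in a few lines. The data below are simulated, and the subject count aside, the scale range, effect size, and effect direction are assumptions for illustration only, not the study's values:

```python
import numpy as np

rng = np.random.default_rng(2)
n_subj = 20
prejudice = rng.uniform(0, 30, n_subj)   # hypothetical prejudice-scale scores

# Simulated mean amplitudes (µV) in the N400 window: in this toy model,
# higher prejudice produces a larger (more negative) incongruency effect.
congruent = rng.normal(2.0, 1.0, n_subj)
incongruent = congruent - 0.1 * prejudice + rng.normal(0, 0.5, n_subj)

# N400 difference wave (Incongruent - Congruent), one value per subject.
diff_wave = incongruent - congruent

# Pearson correlation between the individual N400 effect and the score.
r = np.corrcoef(diff_wave, prejudice)[0, 1]
```

One value per participant goes into the correlation, so the whole analysis reduces to a subtraction and `np.corrcoef` once the window amplitudes have been measured.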
20.
Heliyon ; 6(11): e05570, 2020 Nov.
Article in English | MEDLINE | ID: mdl-33294702

ABSTRACT

Several studies have shown that shifts of visuospatial attention modulate sensory processing at multiple levels of the visual pathways and beyond, including the level of the occipital striate cortices. However, inconsistent findings have been reported, leaving these issues disputed. Twenty-one participants took part in the present study (the EEG signals of 4 of them were discarded due to artifacts). We used ERPs and their neural sources to investigate whether shifting spatial attention across the horizontal meridian of the visual field affected striate cortex activation at the earliest latency. Time series of scalp topographical maps indicated that, unlike ERPs to attentionally neutral central cues, ERPs to attention-directing local cues showed the earliest polarity inversions as a function of the stimulated field and the processing latency range considered, at occipital-parietal electrodes. Between 60 and 75 ms, attentional shifting cues elicited a positivity for both visual fields, whereas at a later latency (75-90 ms) they elicited a positivity and a negativity for the upper and lower visual hemifields, respectively. The computed neural sources included striate, besides extrastriate, cortices for both visual hemifields and latency ranges. In addition, behavioral responses to targets were faster when preceded by local than by neutral cues, and when targets were presented in the upper rather than the lower hemifield. Our findings support the hypothesis that attention shifts may affect early sensory processing in the visual cortices.
