Results 1 - 7 of 7
1.
Exp Brain Res ; 240(1): 249-261, 2022 Jan.
Article in English | MEDLINE | ID: mdl-34727219

ABSTRACT

Visual-spatial selective attention enhances the processing of task-relevant visual events while suppressing the processing of irrelevant ones. In this study, we employed a frequency-tagging paradigm to investigate how sustained visual-spatial attention modulates the first-harmonic and second-harmonic steady-state visual evoked potentials (SSVEPs). Unlike previous studies, which investigated stimulation durations of 10 s or less, we tested a 30-s period. SSVEPs were elicited by simultaneously presenting to the right and left visual hemifields two pattern-reversal checkerboard stimuli modulating at 7.14 Hz and 11.11 Hz. Participants were cued to selectively attend to one visual hemifield while ignoring the other. Behavioral results indicated that participants selectively attended to the cued visual hemifield. When participants attended to the visual stimuli, second-harmonic SSVEPs were larger, but there was no attentional modulation of first harmonics. The results are consistent with the proposal that the neural populations underlying first and second harmonics have distinct functional roles, i.e., the mechanisms generating first harmonics preserve stimulus properties and are resistant to attentional gain, whereas second harmonics mediate attentional modulation. This interpretation is supported by a gain control theory of selective attention.
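The frequency-tagging analysis described here reduces, at its core, to reading spectral amplitude at each stimulation frequency and at its second harmonic. The following Python sketch illustrates that idea only; the sampling rate, window, and synthetic data are assumptions, not the authors' pipeline.

```python
# Minimal sketch (not the authors' code): estimating first- and second-harmonic
# SSVEP amplitudes for two frequency-tagged stimuli from one EEG channel.
# Sampling rate, epoch length and the synthetic signal are illustrative assumptions.
import numpy as np

fs = 512.0                     # sampling rate in Hz (assumed)
tags = [7.14, 11.11]           # stimulation frequencies from the study

def harmonic_amplitude(eeg, freq, harmonic, fs):
    """Spectral amplitude at freq * harmonic for one 1-D epoch."""
    n = eeg.size
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(n))) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    idx = np.argmin(np.abs(freqs - freq * harmonic))
    return spectrum[idx]

# Example with synthetic data: a 30-s epoch of noise plus a tagged response.
t = np.arange(0, 30, 1 / fs)
epoch = np.random.randn(t.size) + 0.5 * np.sin(2 * np.pi * 7.14 * t)

for f in tags:
    a1 = harmonic_amplitude(epoch, f, 1, fs)
    a2 = harmonic_amplitude(epoch, f, 2, fs)
    print(f"{f:.2f} Hz: 1st harmonic {a1:.3f}, 2nd harmonic {a2:.3f}")
```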


Subject(s)
Electroencephalography , Evoked Potentials, Visual , Attention , Cues , Evoked Potentials , Humans , Photic Stimulation
2.
Psychophysiology ; 58(1): e13686, 2021 01.
Article in English | MEDLINE | ID: mdl-33141450

ABSTRACT

The Attentional Blink (AB) usually refers to the impaired report of a second target (T2) that appears within 200-500 ms after a first target within a rapid sequence of distractors. The present study focused on a less studied AB variant known as the unmasked AB, in which T2 is the last item of the sequence and T2 report is unaffected. This aspect of the unmasked AB makes it a promising experimental paradigm in which measures of ongoing event-related processing are unconfounded by differences in late-stage processing. To fully characterize the unmasked AB paradigm, we used a randomization statistics approach to comprehensively examine its electroencephalographic signature. We examined the unmasked AB with auditory and visual T2s: participants attended to either the auditory or the visual information within a sequence of paired auditory-visual stimuli, and reported targets within the attended modality stream while ignoring the other. As predicted, T2 report was unaffected by the unmasked AB. The visual AB was associated with delayed but intact N2 and P3 components and a suppressed N1. We suggest that this N1 is linked to auditory processing of the distractor stream, and reflects the cognitive system prioritizing the processing of visual targets over auditory distractors in response to AB-related processing load. The auditory AB was associated only with a delayed but intact P3. Collectively, these findings support the view that the AB limits the entry of information into consciousness via a late-stage modal bottleneck, and suggest an ongoing compensatory response at early latencies.
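As a rough illustration of the randomization-statistics logic mentioned above, the sketch below runs a simple permutation test on a single ERP measure (for example, mean N1 amplitude inside versus outside the AB window). The trial counts, amplitudes, and window are placeholders, not the authors' data or exact procedure.

```python
# Minimal sketch of a randomization (permutation) test on one ERP measure.
import numpy as np

rng = np.random.default_rng(0)

def permutation_test(cond_a, cond_b, n_perm=5000):
    """Two-sided permutation test on the difference of condition means."""
    observed = cond_a.mean() - cond_b.mean()
    pooled = np.concatenate([cond_a, cond_b])
    n_a = cond_a.size
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = pooled[:n_a].mean() - pooled[n_a:].mean()
        if abs(diff) >= abs(observed):
            count += 1
    return observed, count / n_perm

# Synthetic per-trial N1 amplitudes (microvolts); values are placeholders.
ab_trials = rng.normal(-2.0, 1.0, 60)       # suppressed N1 inside the AB window
control_trials = rng.normal(-3.0, 1.0, 60)  # intact N1 outside the AB window
diff, p = permutation_test(ab_trials, control_trials)
print(f"mean difference = {diff:.2f} uV, p = {p:.4f}")
```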


Subject(s)
Attentional Blink/physiology , Cerebral Cortex/physiology , Evoked Potentials/physiology , Pattern Recognition, Visual/physiology , Speech Perception/physiology , Adult , Electroencephalography , Event-Related Potentials, P300/physiology , Female , Humans , Male , Time Factors , Young Adult
3.
Cortex ; 117: 1-15, 2019 08.
Article in English | MEDLINE | ID: mdl-30925308

ABSTRACT

Our study proposes a test of a key assumption of the most prominent model of consciousness - the global workspace (GWS) model (e.g., Baars, 2002, 2005, 2007; Dehaene & Naccache, 2001; Mudrik, Faivre, & Koch, 2014). This assumption is that multimodal integration requires consciousness; however, few studies have explicitly tested whether integration can occur between nonconscious information from different modalities. The present study examined whether a classic indicator of multimodal integration - the McGurk effect - can be elicited with subliminal auditory-visual speech stimuli. We used a masked speech priming paradigm developed by Kouider and Dupoux (2005) in conjunction with continuous flash suppression (CFS; Tsuchiya & Koch, 2005), a binocular rivalry technique for presenting video stimuli subliminally. Applying these techniques together, we carried out two experiments in which participants categorised auditory syllable targets that were preceded by subliminal auditory-visual (AV) speech primes. Subliminal AV primes were either illusion-inducing (McGurk) or illusion-neutral (Incongruent) combinations of speech stimuli. In Experiment 1, the categorisation of the syllable target ("pa") was facilitated by the same syllable prime when it was part of a McGurk combination (auditory "pa" and visual "ka") but not when part of an Incongruent combination (auditory "pa" and visual "wa"). This dependency on specific AV combinations indicated a nonconscious AV interaction. Experiment 2 presented a different syllable target ("ta"), which matched the predicted illusory outcome of the McGurk combination - here, both the McGurk combination (auditory "pa" and visual "ka") and the Incongruent combination (auditory "ta" and visual "ka") failed to facilitate target categorisation. The combined results of both experiments demonstrate a type of nonconscious multimodal interaction that is distinct from integration - it allows unimodal information that is compatible for integration (i.e., McGurk combinations) to persist and influence later processes, but does not actually combine and alter that information. As the GWS model does not account for non-integrative multimodal interactions, these findings place some pressure on such models of consciousness.
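The priming-facilitation comparison at the heart of this design can be sketched as follows: per-participant target-categorisation performance after McGurk versus Incongruent primes, compared with a paired test. The reaction-time measure, sample size, and values below are assumptions for illustration; the abstract does not specify the dependent measure or analysis.

```python
# Minimal sketch of a priming-facilitation comparison (assumed RT measure).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_participants = 24                                   # assumed sample size

# Mean RT (ms) per participant for targets preceded by each prime type (synthetic).
rt_mcgurk = rng.normal(620, 40, n_participants)       # facilitated condition
rt_incongruent = rng.normal(650, 40, n_participants)  # non-facilitated condition

t, p = stats.ttest_rel(rt_mcgurk, rt_incongruent)
facilitation = (rt_incongruent - rt_mcgurk).mean()
print(f"facilitation = {facilitation:.1f} ms, t = {t:.2f}, p = {p:.4f}")
```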


Subject(s)
Auditory Perception/physiology , Awareness/physiology , Consciousness/physiology , Visual Perception/physiology , Acoustic Stimulation , Humans , Photic Stimulation , Reaction Time/physiology , Subliminal Stimulation
4.
Front Psychol ; 9: 2071, 2018.
Article in English | MEDLINE | ID: mdl-30416477

ABSTRACT

We present the first neurophysiological signatures showing distinctive effects of group social context and emotional arousal on cultural perceptions, such as the efficacy of religious rituals. Using a novel protocol, EEG data were simultaneously recorded from ethnic Chinese religious believers in group and individual settings as they rated the perceived efficacy of low-, medium-, and high-arousal spirit-medium rituals presented as video clips. Neural oscillatory patterns were then analyzed for these perceptual judgements, categorized as low, medium, and high efficacy. The results revealed distinct neural signatures and behavioral patterns between the experimental conditions. Arousal levels predicted ratings of ritual efficacy. Increased efficacy was marked by suppressed alpha and beta power, regardless of group or individual setting. In groups, efficacy ratings converged. The individual setting showed increased within-participant phase synchronization in the alpha and beta bands, while the group setting enhanced between-participant theta phase synchronization, reflecting group participants' orientation toward a common perspective and social coordination. These findings suggest that co-presence in groups leads to a social-tuning effect supported by between-participant theta phase synchrony. Together, these neural synchrony patterns reveal how collective rituals have both individual and communal dimensions. The emotionality of spirit-medium rituals drives individual perceptions of efficacy, while co-presence in groups signals the significance of an event and socially tunes enhanced agreement in perceptual ratings. In other words, mass gatherings may foster social cohesion without necessarily being subject to the group-size scaling limitations of direct face-to-face interaction. This could have implications for the scaling computability of synchrony in large groups as well as for humanistic studies in areas such as symbolic interactionism.
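Band-limited phase synchronization of the kind reported here is commonly quantified with a phase-locking value (PLV). The sketch below shows one standard way to compute it between two signals in the theta band; the filter order, sampling rate, and synthetic channels are assumptions, not the authors' analysis code.

```python
# Minimal sketch of a theta-band phase-locking value (PLV) between two signals.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 256.0                      # sampling rate in Hz (assumed)

def band_phase(x, low, high, fs):
    """Instantaneous phase of a band-pass filtered signal."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return np.angle(hilbert(filtfilt(b, a, x)))

def plv(x, y, low, high, fs):
    """Phase-locking value between two signals in the given band."""
    dphi = band_phase(x, low, high, fs) - band_phase(y, low, high, fs)
    return np.abs(np.mean(np.exp(1j * dphi)))

# Example: theta-band (4-8 Hz) PLV between two synthetic channels.
t = np.arange(0, 10, 1 / fs)
sig_a = np.sin(2 * np.pi * 6 * t) + 0.5 * np.random.randn(t.size)
sig_b = np.sin(2 * np.pi * 6 * t + 0.3) + 0.5 * np.random.randn(t.size)
print(f"theta PLV = {plv(sig_a, sig_b, 4, 8, fs):.2f}")
```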

5.
PLoS One ; 11(2): e0148332, 2016.
Article in English | MEDLINE | ID: mdl-26866807

ABSTRACT

The purpose of the study was to examine, using electroencephalography (EEG), how subliminal priming with words of positive, negative, and neutral emotional content influences the perception of images. Participants were instructed to rate how much they liked the stimulus images on a 7-point Likert scale after being subliminally exposed to masked lexical prime words carrying positive, negative, or neutral connotations with respect to the images; EEG was recorded throughout. Repeated measures ANOVAs and two-tailed paired-samples t-tests were performed to test for significant differences in the likability ratings among the three prime affect types; the results showed a strong shift in likability judgments for the images in the positively primed condition compared to the other two. The acquired EEG data were examined to assess differences in brain activity associated with the three conditions. The consistent results obtained confirmed the overall priming effect on participants' explicit ratings. In addition, machine learning algorithms such as support vector machines (SVMs) and AdaBoost classifiers were applied to infer the prime affect type from the ERPs. The highest classification rates of 95.0% and 70.0%, obtained for the average-trial binary and average-trial multi-class classifiers respectively, further emphasize that the ERPs encode information about the different kinds of primes.
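The decoding step named in the abstract (SVM and AdaBoost classifiers predicting prime affect type from ERPs) can be sketched with scikit-learn as below. The feature matrix, labels, and cross-validation scheme are illustrative assumptions; the authors' features and settings are not specified here.

```python
# Minimal sketch of decoding prime affect type from averaged ERP features.
import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_samples, n_features = 60, 128          # e.g., averaged ERP amplitudes (assumed)
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, 3, n_samples)        # 0 = positive, 1 = negative, 2 = neutral

for name, clf in [("SVM", SVC(kernel="linear")),
                  ("AdaBoost", AdaBoostClassifier(n_estimators=100))]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean cross-validated accuracy = {scores.mean():.2f}")
```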


Subject(s)
Electroencephalography , Evoked Potentials/physiology , Machine Learning , Pattern Recognition, Visual/physiology , Subliminal Stimulation , Support Vector Machine , Adult , Affect , Algorithms , Analysis of Variance , Electroencephalography/methods , Emotions , Female , Humans , Judgment , Language , Male , Models, Statistical , Normal Distribution , Perception , Reaction Time , Young Adult
6.
Cogn Affect Behav Neurosci ; 13(1): 80-93, 2013 Mar.
Article in English | MEDLINE | ID: mdl-23224782

ABSTRACT

Past research has identified an event-related potential (ERP) marker for vocal emotional encoding and has highlighted vocal-processing differences between male and female listeners. We further investigated this ERP vocal-encoding effect in order to determine whether it predicts voice-related changes in listeners' memory for verbal interaction content. Additionally, we explored whether sex differences in vocal processing would affect such changes. To these ends, we presented participants with a series of neutral words spoken with a neutral or a sad voice. The participants subsequently encountered these words, together with new words, in a visual word recognition test. In addition to making old/new decisions, the participants rated the emotional valence of each test word. During the encoding of spoken words, sad voices elicited a greater P200 in the ERP than did neutral voices. While the P200 effect was unrelated to a subsequent recognition advantage for test words previously heard with a neutral as compared to a sad voice, the P200 did significantly predict differences between these words in a concurrent late positive ERP component. Additionally, the P200 effect predicted voice-related changes in word valence. As compared to words studied with a neutral voice, words studied with a sad voice were rated more negatively, and this rating difference was larger, the larger the P200 encoding effect was. While some of these results were comparable in male and female participants, the latter group showed a stronger P200 encoding effect and qualitatively different ERP responses during word retrieval. Estrogen measurements suggested the possibility that these sex differences have a genetic basis.
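The prediction analysis described above amounts to relating each listener's P200 encoding effect (sad minus neutral voice) to their voice-related shift in word-valence ratings. A minimal correlation sketch is given below; the sample size, units, and the strength of the relationship are synthetic assumptions, not the reported data.

```python
# Minimal sketch: correlate the P200 encoding effect with the valence-rating shift.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_listeners = 32                                        # assumed sample size

p200_effect = rng.normal(1.5, 0.8, n_listeners)         # uV, sad minus neutral voice
valence_shift = -0.4 * p200_effect + rng.normal(0, 0.3, n_listeners)  # rating units

r, p = stats.pearsonr(p200_effect, valence_shift)
print(f"r = {r:.2f}, p = {p:.4f}")
```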


Subject(s)
Brain/physiology , Emotions/physiology , Evoked Potentials/physiology , Memory/physiology , Sex Characteristics , Speech Perception/physiology , Adult , Electroencephalography , Female , Humans , Male , Recognition, Psychology/physiology , Voice
7.
Soc Neurosci ; 6(3): 219-30, 2011.
Article in English | MEDLINE | ID: mdl-20711939

ABSTRACT

Being touched by another person influences our readiness to empathize with and support that person. We asked whether this influence arises from somatosensory experience, from proximity to the person, and/or from an attribution of the somatosensory experience to the person. Moreover, we were interested in whether and how touch affects the processing of ensuing events. To this end, we presented neutral and negative pictures with or without gentle pressure to the participants' forearm. In Experiment 1, pressure was applied by a friend, applied by a tactile device and attributed to the friend, or applied by a tactile device and attributed to a computer. Across these conditions, touch enhanced event-related potential (ERP) correlates of picture processing: pictures elicited a larger posterior N100, and a late positivity discriminated more strongly between pictures of neutral and negative content, when participants were touched. Experiment 2 replicated these findings while controlling for the predictive quality of touch. Experiment 3 replaced tactile contact with a tone, which failed to enhance N100 amplitude and the emotion discrimination reflected by the late positivity. This indicates that touch sensitizes ongoing cognitive and emotional processes and that this sensitization is mediated by bottom-up somatosensory processing. Moreover, touch seems to be a special sensory signal that influences recipients in the absence of conscious reflection and that promotes prosocial behavior.
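The "emotion discrimination" measure referred to above can be operationalized as the negative-minus-neutral late-positivity difference per participant, compared between touch and no-touch conditions. The sketch below illustrates that contrast with synthetic amplitudes; the sample size and values are placeholders, not the reported results.

```python
# Minimal sketch of the late-positivity emotion-discrimination contrast.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 28                                             # assumed sample size

# Mean late-positivity amplitude (uV) per participant and condition (synthetic).
lpp = {
    ("touch", "negative"): rng.normal(6.0, 1.0, n),
    ("touch", "neutral"):  rng.normal(3.5, 1.0, n),
    ("none",  "negative"): rng.normal(5.0, 1.0, n),
    ("none",  "neutral"):  rng.normal(4.0, 1.0, n),
}

disc_touch = lpp[("touch", "negative")] - lpp[("touch", "neutral")]
disc_none = lpp[("none", "negative")] - lpp[("none", "neutral")]
t, p = stats.ttest_rel(disc_touch, disc_none)
print(f"discrimination: touch {disc_touch.mean():.1f} uV vs "
      f"no touch {disc_none.mean():.1f} uV, t = {t:.2f}, p = {p:.4f}")
```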


Subject(s)
Attention/physiology , Emotions/physiology , Evoked Potentials, Visual/physiology , Nonverbal Communication/physiology , Touch Perception/physiology , Discrimination, Psychological/physiology , Empathy/physiology , Female , Friends , Humans , Photic Stimulation/methods , Physical Stimulation , Social Behavior , Somatosensory Cortex/physiology , Touch , Visual Cortex/physiology , Young Adult