3.
Soc Cogn Affect Neurosci ; 18(1), 2023 10 31.
Article in English | MEDLINE | ID: mdl-37769357

ABSTRACT

Emotion recognition (ER) declines with increasing age, yet little is known about whether this decline reflects structural brain changes conveyed by differential atrophy. To investigate whether age-related ER decline correlates with reduced grey matter (GM) volume in emotion-related brain regions, we conducted a voxel-based morphometry analysis using data from the Human Connectome Project-Aging (N = 238, aged 36-87), in which facial ER was tested. We expected to find brain regions showing an additive or super-additive age-related change in GM volume, indicating atrophic processes that reduce ER in older adults. The data did not support our hypotheses after correction for multiple comparisons. Exploratory analyses at a threshold of P < 0.001 (uncorrected), however, suggested that relationships between GM volume and age-related general ER may be widely distributed across the cortex. Yet the small effect sizes imply that only a small fraction of the ER decline in older adults can be attributed to local GM volume changes in single voxels or their multivariate patterns.


Subject(s)
Longevity , Magnetic Resonance Imaging , Humans , Aged , Brain/diagnostic imaging , Brain/pathology , Gray Matter/diagnostic imaging , Gray Matter/pathology , Emotions , Atrophy
4.
Hum Brain Mapp ; 42(8): 2416-2433, 2021 06 01.
Article in English | MEDLINE | ID: mdl-33605509

ABSTRACT

Higher impulsivity may arise from neurophysiological deficits of cognitive control in the prefrontal cortex. Cognitive control can be assessed by time-frequency decompositions of electrophysiological data. We aimed to clarify neuroelectric mechanisms of performance monitoring in relation to impulsiveness during a modified Eriksen flanker task in high-impulsive (n = 24) and low-impulsive (n = 21) subjects, and to test whether these mechanisms are modulated by double-blind, sham-controlled intermittent theta burst stimulation (iTBS). We found a larger error-specific peri-response beta power decrease over fronto-central sites in high-impulsive compared to low-impulsive participants, presumably indexing less effective motor execution processes. Lower parieto-occipital theta intertrial phase coherence (ITPC) preceding correct responses predicted higher reaction times (RTs) and higher RT variability, potentially reflecting the efficacy of cognitive control or general attention. Single-trial preresponse theta phase clustering was coupled to RT in correct trials (weighted ITPC), reflecting oscillatory dynamics that predict trial-specific behavior. iTBS modulated neither behavior nor EEG time-frequency power. Performance monitoring was associated with time-frequency patterns reflecting cognitive control (parieto-occipital theta ITPC, theta weighted ITPC) as well as differential action planning/execution processes linked to trait impulsivity (frontal low beta power). Beyond that, the results suggest no stimulation effect on response-locked time-frequency dynamics with the current stimulation protocol. Neural oscillatory responses to performance monitoring thus differ between high- and low-impulsive individuals but are unaffected by iTBS.
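The ITPC measure named in this abstract has a standard definition: the length of the mean resultant vector of the per-trial phase angles at one time-frequency point. A minimal numpy sketch (the synthetic phase data are purely illustrative, not the study's EEG):

```python
import numpy as np

def itpc(phases):
    """Intertrial phase coherence: the length of the mean resultant
    vector of per-trial phase angles at one time-frequency point
    (0 = phases uniformly scattered, 1 = identical across trials)."""
    return np.abs(np.mean(np.exp(1j * np.asarray(phases))))

# Illustrative synthetic data: phases clustered around pi/4 vs. uniform
rng = np.random.default_rng(0)
clustered = rng.normal(np.pi / 4, 0.3, size=100)
uniform = rng.uniform(-np.pi, np.pi, size=100)
```

The weighted ITPC variant mentioned above additionally scales each trial's unit phase vector by a behavioral measure such as RT before averaging, so that phase clustering coupled to behavior is detected.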


Subject(s)
Cerebral Cortex/physiology , Electroencephalography , Executive Function/physiology , Impulsive Behavior/physiology , Psychomotor Performance/physiology , Theta Rhythm/physiology , Transcranial Magnetic Stimulation , Adult , Attention/physiology , Double-Blind Method , Female , Humans , Male , Reaction Time/physiology , Young Adult
5.
Elife ; 9, 2020 12 15.
Article in English | MEDLINE | ID: mdl-33319749

ABSTRACT

To form a more reliable percept of the environment, the brain needs to estimate its own sensory uncertainty. Current theories of perceptual inference assume that the brain computes sensory uncertainty instantaneously and independently for each stimulus. We evaluated this assumption in four psychophysical experiments in which human observers localized auditory signals that were presented synchronously with spatially disparate visual signals. Critically, the visual noise changed dynamically over time, either continuously or with intermittent jumps. Our results show that observers integrate audiovisual inputs weighted by sensory uncertainty estimates that combine information from past and current signals, consistent with an optimal Bayesian learner that can be approximated by exponential discounting. Our results challenge leading models of perceptual inference in which sensory uncertainty estimates depend only on the current stimulus. They demonstrate that the brain capitalizes on the temporal dynamics of the external world and estimates sensory uncertainty by combining past experiences with new incoming sensory signals.
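Exponential discounting of past evidence, which the abstract reports as an approximation to the optimal Bayesian learner, can be sketched as a leaky running estimate of sensory variance. This is a generic illustration, not the study's fitted model; the learning rate `alpha` is a free parameter chosen here for demonstration:

```python
import numpy as np

def discounted_variance(samples, alpha=0.1):
    """Leaky (exponentially discounted) estimate of sensory variance:
    each new squared prediction error is blended into the running
    estimate with weight alpha, so older evidence decays geometrically."""
    est_mean, est_var = float(samples[0]), 1.0
    for x in samples[1:]:
        err = x - est_mean
        est_mean += alpha * err                              # leaky mean
        est_var = (1 - alpha) * est_var + alpha * err ** 2   # leaky variance
    return est_var

# Illustrative signal streams under high vs. low visual noise
rng = np.random.default_rng(2)
noisy = rng.normal(0.0, 3.0, size=500)
quiet = rng.normal(0.0, 0.5, size=500)
```

A smaller `alpha` corresponds to averaging uncertainty over a longer window of past signals; `alpha = 1` recovers the instantaneous, stimulus-by-stimulus estimate the abstract argues against.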


Subject(s)
Auditory Perception/physiology , Brain/physiology , Uncertainty , Visual Perception/physiology , Adolescent , Adult , Bayes Theorem , Female , Humans , Male , Middle Aged , Noise , Psychophysics , Young Adult
6.
Q J Exp Psychol (Hove) ; 73(12): 2260-2271, 2020 Dec.
Article in English | MEDLINE | ID: mdl-32698727

ABSTRACT

Our senses are stimulated continuously. Through multisensory integration, different sensory inputs may or may not be combined into a unitary percept. Simultaneously with this stimulation, people are frequently engaged in social interactions, but how multisensory integration and social processing interact is largely unknown. The present study investigated whether, and how, the multisensory sound-induced flash illusion is affected by a social manipulation. In the sound-induced flash illusion, a participant typically receives one visual flash and two auditory beeps and is required to indicate the number of flashes perceived. Often, the auditory beeps alter the perception of the flashes such that the participant tends to perceive two flashes instead of one. We tested whether performing a flash counting task with a partner (a confederate), who was required to indicate the number of presented beeps, would modulate this illusion. We found that the sound-induced flash illusion was perceived significantly more often when the flash counting task was performed with the confederate than when it was performed alone. Yet this effect disappeared when visual access between the two individuals was prevented. These findings, combined with previous results, suggest that performing a multisensory task jointly (in this case an audiovisual task) lowers the extent to which an individual attends to visual information, which in turn affects the multisensory integration process.


Subject(s)
Illusions , Acoustic Stimulation , Auditory Perception , Female , Humans , Male , Photic Stimulation , Visual Perception
7.
Nat Commun ; 10(1): 1907, 2019 04 23.
Article in English | MEDLINE | ID: mdl-31015423

ABSTRACT

Transforming the barrage of sensory signals into a coherent multisensory percept relies on solving the binding problem: deciding whether signals come from a common cause and should be integrated or, instead, segregated. Human observers typically arbitrate between integration and segregation consistent with Bayesian Causal Inference, but the neural mechanisms remain poorly understood. Here, we presented people with audiovisual sequences that varied in the number of flashes and beeps, then combined Bayesian modelling and EEG representational similarity analyses. Our data suggest that the brain initially represents the number of flashes and beeps independently. Later, it computes their numbers by averaging the forced-fusion and segregation estimates weighted by the probabilities of common and independent cause models (i.e. model averaging). Crucially, prestimulus oscillatory alpha power and phase correlate with observers' prior beliefs about the world's causal structure that guide their arbitration between sensory integration and segregation.
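Model averaging, as described in this abstract, combines the forced-fusion and segregation estimates weighted by the posterior probabilities of the two causal structures. A minimal sketch (the numbers are illustrative, not fitted values from the study):

```python
def model_average(fused_est, segregated_est, p_common):
    """Bayesian Causal Inference with model averaging: weight the
    forced-fusion estimate by the posterior probability of a common
    cause and the segregation estimate by that of independent causes."""
    return p_common * fused_est + (1.0 - p_common) * segregated_est

# e.g. 2 events under forced fusion, 1 under segregation, and a 0.7
# posterior for a common cause yield an averaged estimate of about 1.7
estimate = model_average(2.0, 1.0, 0.7)
```

At the extremes this reduces to pure fusion (`p_common = 1`) or pure segregation (`p_common = 0`); intermediate posteriors produce the graded percepts the paradigm exploits.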


Subject(s)
Auditory Perception/physiology , Models, Neurological , Neocortex/physiology , Sensation/physiology , Visual Perception/physiology , Acoustic Stimulation , Adult , Bayes Theorem , Cues , Electroencephalography , Female , Humans , Male , Middle Aged , Neocortex/anatomy & histology , Photic Stimulation
8.
Sci Rep ; 8(1): 12376, 2018 08 17.
Article in English | MEDLINE | ID: mdl-30120294

ABSTRACT

Information integration across the senses is fundamental for effective interactions with our environment. The extent to which signals from different senses can interact in the absence of awareness is controversial. Combining the spatial ventriloquist illusion and dynamic continuous flash suppression (dCFS), we investigated in a series of two experiments whether visual signals that observers do not consciously perceive can influence spatial perception of sounds. Importantly, dCFS obliterated visual awareness only on a fraction of trials allowing us to compare spatial ventriloquism for physically identical flashes that were judged as visible or invisible. Our results show a stronger ventriloquist effect for visible than invisible flashes. Critically, a robust ventriloquist effect emerged also for invisible flashes even when participants were at chance when locating the flash. Collectively, our findings demonstrate that signals that we are not aware of in one sensory modality can alter spatial perception of signals in another sensory modality.


Subject(s)
Auditory Perception/physiology , Visual Perception/physiology , Acoustic Stimulation , Adolescent , Adult , Female , Humans , Illusions , Male , Photic Stimulation , Young Adult
9.
eNeuro ; 5(1), 2018.
Article in English | MEDLINE | ID: mdl-29527567

ABSTRACT

Behaviorally, it is well established that human observers integrate signals near-optimally weighted in proportion to their reliabilities as predicted by maximum likelihood estimation. Yet, despite abundant behavioral evidence, it is unclear how the human brain accomplishes this feat. In a spatial ventriloquist paradigm, participants were presented with auditory, visual, and audiovisual signals and reported the location of the auditory or the visual signal. Combining psychophysics, multivariate functional MRI (fMRI) decoding, and models of maximum likelihood estimation (MLE), we characterized the computational operations underlying audiovisual integration at distinct cortical levels. We estimated observers' behavioral weights by fitting psychometric functions to participants' localization responses. Likewise, we estimated the neural weights by fitting neurometric functions to spatial locations decoded from regional fMRI activation patterns. Our results demonstrate that low-level auditory and visual areas encode predominantly the spatial location of the signal component of a region's preferred auditory (or visual) modality. By contrast, intraparietal sulcus forms spatial representations by integrating auditory and visual signals weighted by their reliabilities. Critically, the neural and behavioral weights and the variance of the spatial representations depended not only on the sensory reliabilities as predicted by the MLE model but also on participants' modality-specific attention and report (i.e., visual vs. auditory). These results suggest that audiovisual integration is not exclusively determined by bottom-up sensory reliabilities. Instead, modality-specific attention and report can flexibly modulate how intraparietal sulcus integrates sensory signals into spatial representations to guide behavioral responses (e.g., localization and orienting).
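The MLE scheme this abstract tests predicts that each cue is weighted by its reliability (inverse variance) and that the fused estimate is more reliable than either cue alone. A compact sketch under those textbook assumptions (argument names are illustrative):

```python
def mle_fusion(x_a, var_a, x_v, var_v):
    """Maximum-likelihood cue combination: weights are the normalized
    reliabilities (inverse variances), and the fused variance is
    lower than that of either single cue."""
    rel_a, rel_v = 1.0 / var_a, 1.0 / var_v
    w_a = rel_a / (rel_a + rel_v)
    fused = w_a * x_a + (1.0 - w_a) * x_v
    fused_var = 1.0 / (rel_a + rel_v)
    return fused, fused_var
```

With equal variances the estimate is the midpoint of the two cue locations; a more reliable visual cue pulls the estimate toward the visual location, mirroring the behavioral and neural weights described above.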


Subject(s)
Attention/physiology , Auditory Perception/physiology , Brain/physiology , Visual Perception/physiology , Adult , Brain/diagnostic imaging , Brain Mapping , Female , Humans , Magnetic Resonance Imaging , Male , Psychophysics , Young Adult
10.
Front Hum Neurosci ; 12: 540, 2018.
Article in English | MEDLINE | ID: mdl-30692922

ABSTRACT

The World Health Organization has defined health as "complete physical, mental and social well-being and not merely the absence of disease or infirmity" (World Health Organization, 1948). An increasing number of studies have therefore started to investigate "the good life." However, the underlying variation in brain activity has rarely been examined. The goal of this study was to assess differences in resting-state functional connectivity (RSFC) between typical healthy individuals and healthy individuals with a high occurrence of flourishing and subjective vitality. Together, flourishing (a broad measure of psycho-social functioning) and subjective vitality (an organismic marker of subjective well-being) comprise the phenomenological opposite of a major depressive disorder. Out of a group of 43 participants, 20 high-flourishing (highFl) and 18 high-vital (highSV) individuals underwent a 7-min resting-state period during which cortical activity in posterior brain areas was assessed using functional near-infrared spectroscopy (fNIRS). Network-based statistics (NBS) of FC yielded significantly different FC patterns for the highFl and highSV individuals compared to their healthy comparison group. The networks converged at areas of the posterior default mode network and differed in hub nodes in the left middle temporal/fusiform gyrus (flourishing) and the left primary/secondary somatosensory cortex (subjective vitality). The attained networks are discussed with regard to recent neuroscientific findings for other well-being measures and potential mechanisms of action based on social information processing and body-related self-perception.

11.
Neurosci Lett ; 649: 34-40, 2017 05 10.
Article in English | MEDLINE | ID: mdl-28347858

ABSTRACT

BACKGROUND: The relationship between task-positive and task-negative components of brain networks has repeatedly been shown to be characterized by dissociated fluctuations of spontaneous brain activity. We tested whether the interaction between task-positive and task-negative brain areas during resting state predicts higher interference susceptibility, i.e., increased reaction times (RTs), during an Attention Modulation by Salience Task (AMST). METHODS: 29 males underwent 3T resting-state magnetic resonance imaging. Subsequently, they performed the AMST, which measures RTs to early- and late-onset auditory stimuli while high- or low-salient visual distractors are perceived. We conducted seed-based resting-state functional connectivity (rsFC) analyses using global signal correction. We assessed general responsiveness and salience-related interference in the AMST and set these into the context of the rsFC between a key salience network region (dACC; task-positive) and a key default mode network region (precuneus; task-negative). RESULTS: With increasing RTs to high- but not low-salient pictures, the dACC showed significantly weakened functional dissociation from a cluster in the precuneus. This cluster overlaps with a cluster whose dACC rsFC correlates with subjects' interference, measured as high-salient RTs relative to low-salient RTs. CONCLUSION: Our findings suggest that the interaction between the salience network (SN) and the default mode network (DMN) at rest predicts susceptibility to distraction. Subjects who were more susceptible to high-salient stimuli (task-irrelevant external information) showed increased dACC rsFC toward the precuneus. This is consistent with prior work in individuals with impaired attentional focus. Future studies might clarify whether increased rsFC between an SN region and a DMN region can serve as a predictor for clinical syndromes characterized by attentional impairments, e.g., ADHD. This could lead to alternative, objective diagnosis and treatment of such disorders by decreasing the rsFC of these regions.


Subject(s)
Attention/physiology , Gyrus Cinguli/physiology , Reaction Time , Adult , Brain/physiology , Brain Mapping , Humans , Magnetic Resonance Imaging , Male , Neural Pathways/physiology , Psychomotor Performance
12.
Curr Biol ; 26(4): 509-14, 2016 Feb 22.
Article in English | MEDLINE | ID: mdl-26853368

ABSTRACT

Human observers typically integrate sensory signals into a coherent percept in a statistically optimal fashion by weighting them in proportion to their reliabilities. An emerging debate in neuroscience concerns the extent to which multisensory integration emerges already in primary sensory areas or is deferred to higher-order association areas. This fMRI study used multivariate pattern decoding to characterize the computational principles that define how auditory and visual signals are integrated into spatial representations across the cortical hierarchy. Our results reveal small multisensory influences that were limited to a spatial window of integration in primary sensory areas. By contrast, parietal cortices integrated signals weighted by their sensory reliabilities and task relevance, in line with behavioral performance and principles of statistical optimality. Intriguingly, audiovisual integration in parietal cortices was attenuated for large spatial disparities, when signals were unlikely to originate from a common source. Our results demonstrate that multisensory interactions in primary and association cortices are governed by distinct computational principles. In primary visual cortices, spatial disparity controlled the influence of non-visual signals on the formation of spatial representations, whereas in parietal cortices, it determined the influence of task-irrelevant signals. Critically, only parietal cortices integrated signals weighted by their bottom-up reliabilities and top-down task relevance into multisensory spatial priority maps to guide spatial orienting.


Subject(s)
Auditory Perception , Parietal Lobe/physiology , Space Perception , Visual Cortex/physiology , Visual Perception , Adult , Female , Humans , Magnetic Resonance Imaging , Male , Young Adult
13.
J Vis ; 15(5): 22, 2015.
Article in English | MEDLINE | ID: mdl-26067540

ABSTRACT

To obtain a coherent percept of the environment, the brain should integrate sensory signals from common sources and segregate those from independent sources. Recent research has demonstrated that humans integrate audiovisual information during spatial localization consistent with Bayesian Causal Inference (CI). However, the decision strategies that human observers employ for implicit and explicit CI remain unclear. Further, despite the key role of sensory reliability in multisensory integration, Bayesian CI has never been evaluated across a wide range of sensory reliabilities. This psychophysics study presented participants with spatially congruent and discrepant audiovisual signals at four levels of visual reliability. Participants localized the auditory signals (implicit CI) and judged whether auditory and visual signals came from common or independent sources (explicit CI). Our results demonstrate that humans employ model averaging as a decision strategy for implicit CI; they report an auditory spatial estimate that averages the spatial estimates under the two causal structures weighted by their posterior probabilities. Likewise, they explicitly infer a common source during the common-source judgment when the posterior probability for a common source exceeds a fixed threshold of 0.5. Critically, sensory reliability shapes multisensory integration in Bayesian CI via two distinct mechanisms: First, higher sensory reliability sensitizes humans to spatial disparity and thereby sharpens their multisensory integration window. Second, sensory reliability determines the relative signal weights in multisensory integration under the assumption of a common source. In conclusion, our results demonstrate that Bayesian CI is fundamental for integrating signals of variable reliabilities.


Subject(s)
Auditory Perception/physiology , Perceptual Masking/physiology , Visual Perception/physiology , Acoustic Stimulation/methods , Adult , Bayes Theorem , Female , Humans , Male , Psychophysics , Reproducibility of Results , Young Adult
14.
J Neurosci ; 35(14): 5655-63, 2015 Apr 08.
Article in English | MEDLINE | ID: mdl-25855179

ABSTRACT

Emotions can be aroused by various kinds of stimulus modalities. Recent neuroimaging studies indicate that several brain regions represent emotions at an abstract level, i.e., independently of the sensory cues from which they are perceived (e.g., face, body, or voice stimuli). If emotions are indeed represented at such an abstract level, then these abstract representations should also be activated by the memory of an emotional event. We tested this hypothesis by asking human participants to learn associations between emotional stimuli (videos of faces or bodies) and non-emotional stimuli (fractals). After successful learning, fMRI signals were recorded during the presentation of emotional stimuli and emotion-associated fractals. We tested whether emotions could be decoded from fMRI signals evoked by the fractal stimuli using a classifier trained on the responses to the emotional stimuli (and vice versa). This was implemented as a whole-brain searchlight, multivoxel activation pattern analysis, which revealed successful emotion decoding in four brain regions: posterior cingulate cortex (PCC), precuneus, MPFC, and angular gyrus. The same analysis run only on responses to emotional stimuli revealed clusters in PCC, precuneus, and MPFC. Multidimensional scaling analysis of the activation patterns revealed clear clustering of responses by emotion across stimulus types. Our results suggest that PCC, precuneus, and MPFC contain representations of emotions that can be evoked by stimuli that carry emotional information themselves or by stimuli that evoke memories of emotional stimuli, while the angular gyrus is more likely to take part in emotional memory retrieval.
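The cross-decoding logic (train a classifier on responses to one stimulus type, test on the other) can be illustrated with a simple nearest-centroid classifier; the synthetic "voxel" patterns below are stand-ins for the study's fMRI data, and the classifier itself is a generic substitute for whatever was used in the searchlight:

```python
import numpy as np

def nearest_centroid_cross_decode(train_X, train_y, test_X):
    """Fit class centroids on patterns from one stimulus type and
    classify patterns from the other; above-chance accuracy implies
    a shared, stimulus-independent representation."""
    classes = np.unique(train_y)
    centroids = np.stack([train_X[train_y == c].mean(axis=0) for c in classes])
    dists = ((test_X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
    return classes[np.argmin(dists, axis=1)]

# Illustrative two-"emotion" data: 40 trials per class, 20 "voxels"
rng = np.random.default_rng(3)
n, d = 40, 20
means = {0: np.zeros(d), 1: np.full(d, 3.0)}
train_X = np.vstack([rng.normal(means[c], 1.0, (n, d)) for c in (0, 1)])
train_y = np.repeat([0, 1], n)
test_X = np.vstack([rng.normal(means[c], 1.0, (n, d)) for c in (0, 1)])
```

Here `train_X` would hold responses to emotional videos and `test_X` responses to the associated fractals (or vice versa).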


Subject(s)
Association Learning , Brain Mapping , Brain/physiology , Concept Formation/physiology , Emotions/physiology , Adult , Analysis of Variance , Brain/blood supply , Facial Expression , Female , Humans , Imaging, Three-Dimensional , Magnetic Resonance Imaging , Male , Movement/physiology , Oxygen/blood , Photic Stimulation , Young Adult
16.
PLoS Biol ; 13(2): e1002073, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25710328

ABSTRACT

To form a veridical percept of the environment, the brain needs to integrate sensory signals from a common source but segregate those from independent sources. Thus, perception inherently relies on solving the "causal inference problem." Behaviorally, humans solve this problem optimally as predicted by Bayesian Causal Inference; yet, the underlying neural mechanisms are unexplored. Combining psychophysics, Bayesian modeling, functional magnetic resonance imaging (fMRI), and multivariate decoding in an audiovisual spatial localization task, we demonstrate that Bayesian Causal Inference is performed by a hierarchy of multisensory processes in the human brain. At the bottom of the hierarchy, in auditory and visual areas, location is represented on the basis that the two signals are generated by independent sources (= segregation). At the next stage, in posterior intraparietal sulcus, location is estimated under the assumption that the two signals are from a common source (= forced fusion). Only at the top of the hierarchy, in anterior intraparietal sulcus, the uncertainty about the causal structure of the world is taken into account and sensory signals are combined as predicted by Bayesian Causal Inference. Characterizing the computational operations of signal interactions reveals the hierarchical nature of multisensory perception in human neocortex. It unravels how the brain accomplishes Bayesian Causal Inference, a statistical computation fundamental for perception and cognition. Our results demonstrate how the brain combines information in the face of uncertainty about the underlying causal structure of the world.


Subject(s)
Auditory Perception/physiology , Nerve Net/physiology , Neural Pathways/physiology , Psychomotor Performance/physiology , Visual Perception/physiology , Acoustic Stimulation , Adult , Auditory Cortex/anatomy & histology , Auditory Cortex/physiology , Bayes Theorem , Brain Mapping , Cognition/physiology , Eye Movements/physiology , Female , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Parietal Lobe/anatomy & histology , Parietal Lobe/physiology , Photic Stimulation , Psychophysics , Reaction Time , Visual Cortex/anatomy & histology , Visual Cortex/physiology
17.
Front Neurosci ; 7: 116, 2013.
Article in English | MEDLINE | ID: mdl-23882174

ABSTRACT

Conventional neuroimaging analyses provide information about condition-related changes of the BOLD (blood-oxygen-level-dependent) signal, indicating only where and when the underlying cognitive processes occur. Recently, with the help of an approach called model-based functional neuroimaging (fMRI), researchers are able to visualize changes in the internal variables of a time-varying learning process, such as the reward prediction error or the predicted reward value of a conditioned stimulus. However, despite being extremely beneficial to the imaging community in understanding the neural correlates of decision variables, a model-based approach to brain imaging data is also methodologically challenging due to multicollinearity in the statistical analysis. There are multiple sources of multicollinearity in functional neuroimaging, including investigations of closely related variables and/or experimental designs that do not account for their correlation. The source of multicollinearity discussed in this paper arises from correlation between different subjective variables that are calculated very close together in time. Here, we review methodological approaches to analyzing such data by discussing the special case of separating the reward prediction error signal from reward outcomes.
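The multicollinearity problem described here can be quantified regressor by regressor; for two regressors the variance inflation factor reduces to 1/(1 - r²). A toy numpy sketch (the outcome/prediction-error construction is illustrative, not one of the reviewed designs):

```python
import numpy as np

def vif(x, y):
    """Variance inflation factor of one regressor against another:
    1 / (1 - r^2). Values well above 1 mean a GLM cannot cleanly
    attribute shared variance to either regressor."""
    r = np.corrcoef(x, y)[0, 1]
    return 1.0 / (1.0 - r ** 2)

# Toy construction: a binary reward outcome and a prediction error
# defined as outcome minus a (hypothetical) expectation, so the two
# regressors are strongly collinear by construction
rng = np.random.default_rng(1)
outcome = rng.integers(0, 2, size=200).astype(float)
expectation = rng.uniform(0.3, 0.7, size=200)
pred_error = outcome - expectation
```

Because the prediction error contains the outcome as a term, its correlation with the outcome regressor is high, which is exactly the separation problem the review addresses.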

18.
Eur J Neurosci ; 36(3): 2376-82, 2012 Aug.
Article in English | MEDLINE | ID: mdl-22595033

ABSTRACT

The representation of reward anticipation and reward prediction errors is the basis for reward-associated learning. The representation of whether or not a reward occurred (reward receipt) is important for decision making. Recent studies suggest that, while reward anticipation and reward prediction errors are encoded in the midbrain and the ventral striatum, reward receipts are encoded in the medial orbitofrontal cortex. In order to substantiate this functional specialization we analyzed data from an fMRI study in which 59 subjects completed two simple monetary reward paradigms. Because reward receipts and reward prediction errors were correlated, a statistical model comparison was applied separating the effects of the two. Reward prediction error fitted BOLD responses significantly better than reward receipt in the midbrain and the ventral striatum. Conversely, reward receipt fitted BOLD responses better in the orbitofrontal cortex. Activation related to reward anticipation was found in the orbitofrontal cortex. The results confirm a functional specialization of behaviorally important aspects of reward processing within the mesolimbic dopaminergic system.


Subject(s)
Brain/physiology , Decision Making/physiology , Reward , Adult , Female , Humans , Magnetic Resonance Imaging , Male , Neuropsychological Tests
19.
Neuroimage ; 50(3): 1168-76, 2010 Apr 15.
Article in English | MEDLINE | ID: mdl-20083206

ABSTRACT

Reward processing is a central component of learning and decision making. Functional magnetic resonance imaging (fMRI) has contributed substantially to our understanding of reward processing in humans. The strength of reward-related brain responses might serve as a valuable marker for, or correlate of, individual preferences or personality traits. An essential prerequisite for this is sufficient reliability of individual measures of reward-related brain signals. We therefore determined test-retest reliabilities of BOLD responses to reward prediction, reward receipt, and reward prediction errors in the ventral striatum and the orbitofrontal cortex in 25 subjects undergoing three different simple reward paradigms (retest interval 7-13 days). Although on a group level the paradigms consistently led to significant activations of the relevant brain areas in both sessions, across-subject retest reliabilities were only poor to fair, with intraclass correlation coefficients (ICCs) of -0.15 to 0.44. ICCs for motor activations were considerably higher (0.32 to 0.73). Our results reveal the methodological difficulties behind across-subject correlations in fMRI research on reward processing and demonstrate the need for studies that address methods to optimize the retest reliability of fMRI.
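The ICC used in test-retest work comes in several variants; a one-way random-effects ICC(1,1) on a subjects × sessions matrix is the simplest and shows why noisy individual differences yield low values. This is a generic sketch, not necessarily the exact ICC variant computed in the study:

```python
import numpy as np

def icc_oneway(data):
    """One-way random-effects ICC(1,1) for a subjects x sessions
    matrix: between-subject variance relative to total variance,
    computed from the one-way ANOVA mean squares."""
    n, k = data.shape
    grand = data.mean()
    ms_between = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_within = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
```

Perfectly reproducible subject differences give an ICC near 1; responses dominated by session-to-session noise give values near (or below) 0, the "poor to fair" range reported above.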


Subject(s)
Brain/physiology , Magnetic Resonance Imaging/methods , Oxygen/blood , Reward , Adult , Basal Ganglia/blood supply , Basal Ganglia/physiology , Brain/blood supply , Cognition/physiology , Female , Frontal Lobe/blood supply , Frontal Lobe/physiology , Humans , Male , Neuropsychological Tests , Reaction Time , Reproducibility of Results , Surveys and Questionnaires , Time Factors , Young Adult