Results 1 - 11 of 11
1.
Philos Trans R Soc Lond B Biol Sci ; 379(1908): 20230254, 2024 Aug 26.
Article in English | MEDLINE | ID: mdl-39005038

ABSTRACT

Sound serves as a potent medium for emotional well-being, with phenomena like the autonomous sensory meridian response (ASMR) showing a unique capacity for inducing relaxation and alleviating stress. This study aimed to understand how the tingling sensations (and, for comparison, pleasant feelings) that ASMR videos induce relate to acoustic features, using a broader range of ASMR videos as stimuli. The sound texture statistics and their timing predictive of tingling and pleasantness were identified through L1-regularized linear regression. Tingling was well-predicted (r = 0.52), predominantly by the envelope of frequencies near 5 kHz in the 1500 to 750 ms period before the response: stronger tingling was associated with a lower amplitude around the 5 kHz frequency range. This finding was further validated using an independent set of ASMR sounds. The prediction of pleasantness was more challenging (r = 0.26), requiring a longer effective time window, threefold that for tingling. These results enhance our understanding of how specific acoustic elements can induce tingling sensations, and how these elements differ from those that induce pleasant feelings. Our findings have potential applications in optimizing ASMR stimuli to improve quality of life and alleviate stress and anxiety, thus expanding the scope of ASMR stimulus production beyond traditional methods. This article is part of the theme issue 'Sensing and feeling: an integrative approach to sensory processing and emotional experience'.
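The L1-regularized linear regression used here can be sketched in plain numpy via ISTA (proximal gradient descent) on synthetic data; the feature count, regularization strength, and data below are illustrative stand-ins, not the paper's actual acoustic features or settings.

```python
import numpy as np

def lasso_ista(X, y, lam=0.05, n_iter=500):
    """L1-regularized linear regression (lasso) via ISTA.
    Minimizes ||Xw - y||^2 / (2n) + lam * ||w||_1 and returns a sparse w."""
    n, d = X.shape
    # Step size from the Lipschitz constant of the smooth part
    lr = 1.0 / (np.linalg.eigvalsh(X.T @ X / n).max() + 1e-12)
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n          # gradient of the squared loss
        w = w - lr * grad
        # Soft-thresholding: proximal operator of the L1 penalty
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)
    return w

# Synthetic stand-in for the paper's setup: many candidate acoustic
# features, only a few of which actually drive the rating.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
true_w = np.zeros(50)
true_w[[3, 17]] = [1.5, -2.0]
y = X @ true_w + 0.1 * rng.standard_normal(200)
w = lasso_ista(X, y)
```

The L1 penalty zeroes out most weights, which is what makes it possible to read off which sound texture statistics (and which time lags) carry predictive power.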


Subject(s)
Humans , Male , Emotions/physiology , Female , Adult , Young Adult , Pleasure/physiology , Acoustic Stimulation , Sound , Meridians , Auditory Perception , Sensation/physiology
2.
Cereb Cortex ; 33(19): 10441-10452, 2023 09 26.
Article in English | MEDLINE | ID: mdl-37562851

ABSTRACT

Attention levels fluctuate during the course of daily activities. However, the factors underlying sustained attention are still unknown. We investigated mechanisms of sustained attention using psychological, neuroimaging, and neurochemical approaches. Participants were scanned with functional magnetic resonance imaging (fMRI) while performing gradual-onset continuous performance tasks (gradCPTs). In gradCPTs, narrations or visual scenes gradually changed from one to the next. Participants pressed a button for frequent Go trials as quickly as possible and withheld responses to infrequent No-go trials. Performance was better for the visual gradCPT than for the auditory gradCPT, but the two were correlated. The dorsal attention network was activated during intermittent responses, regardless of sensory modality. Reaction-time variability in the gradCPTs was correlated with signal changes (SCs) in left fronto-parietal regions. We also used magnetic resonance spectroscopy (MRS) to measure levels of glutamate-glutamine (Glx) and γ-aminobutyric acid (GABA) in the left prefrontal cortex (PFC). Glx levels were associated with performance in undemanding situations, whereas GABA levels were related to performance in demanding situations. Combined fMRI-MRS results demonstrated that SCs of the left PFC were positively correlated with neurometabolite levels. These findings suggest that a neural balance between excitation and inhibition is involved in attentional fluctuations and brain dynamics.


Subject(s)
Glutamine , Humans , Glutamic Acid/analysis , Magnetic Resonance Imaging/methods , Magnetic Resonance Spectroscopy , Prefrontal Cortex , gamma-Aminobutyric Acid/analysis
3.
J Neurosci ; 43(21): 3876-3894, 2023 05 24.
Article in English | MEDLINE | ID: mdl-37185101

ABSTRACT

Natural sounds contain rich patterns of amplitude modulation (AM), which is one of the essential sound dimensions for auditory perception. The sensitivity of human hearing to AM measured by psychophysics takes diverse forms depending on the experimental conditions. Here, we address with a single framework the questions of why such patterns of AM sensitivity have emerged in the human auditory system and how they are realized by our neural mechanisms. Assuming that optimization for natural sound recognition has taken place during human evolution and development, we examined its effect on the formation of AM sensitivity by optimizing a computational model, specifically, a multilayer neural network, for natural sound (namely, everyday sounds and speech sounds) recognition and simulating psychophysical experiments in which the AM sensitivity of the model was assessed. Relatively higher layers in the model optimized to sounds with natural AM statistics exhibited AM sensitivity similar to that of humans, although the model was not designed to reproduce human-like AM sensitivity. Moreover, simulated neurophysiological experiments on the model revealed a correspondence between the model layers and the auditory brain regions. The layers in which human-like psychophysical AM sensitivity emerged exhibited substantial neurophysiological similarity with the auditory midbrain and higher regions. These results suggest that human behavioral AM sensitivity has emerged as a result of optimization for natural sound recognition in the course of our evolution and/or development and that it is based on a stimulus representation encoded in the neural firing rates in the auditory midbrain and higher regions.

SIGNIFICANCE STATEMENT: This study provides a computational paradigm to bridge the gap between the behavioral properties of human sensory systems as measured in psychophysics and neural representations as measured in nonhuman neurophysiology. This was accomplished by combining knowledge and techniques from psychophysics, neurophysiology, and machine learning. As a specific target modality, we focused on the auditory sensitivity to sound AM. We built an artificial neural network model that performs natural sound recognition and simulated psychophysical and neurophysiological experiments in the model. Quantitative comparison of a machine-learning model with human and nonhuman data made it possible to integrate knowledge of behavioral AM sensitivity and neural AM tunings from the perspective of optimization for natural sound recognition.
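The simulated psychophysical experiments probe AM sensitivity with modulated sounds; a standard stimulus in AM-detection (temporal modulation transfer function) experiments is sinusoidally amplitude-modulated noise. A minimal sketch, with illustrative parameters rather than the paper's actual stimulus settings:

```python
import numpy as np

def am_noise(dur=1.0, fs=16000, fm=8.0, depth=0.5, seed=0):
    """Sinusoidally amplitude-modulated Gaussian noise.
    depth m in [0, 1] controls modulation strength; m = 0 gives
    unmodulated noise, the reference condition in detection tasks."""
    rng = np.random.default_rng(seed)
    t = np.arange(int(dur * fs)) / fs
    carrier = rng.standard_normal(t.size)          # broadband noise carrier
    env = 1.0 + depth * np.sin(2 * np.pi * fm * t) # sinusoidal envelope
    x = env * carrier
    return x / np.max(np.abs(x))                   # peak-normalize

stim = am_noise(fm=8.0, depth=0.8)
```

Sweeping `fm` and finding the smallest detectable `depth` at each rate traces out the kind of AM sensitivity curve the model was tested on.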


Subject(s)
Sound , Humans , Auditory Perception/physiology , Brain/physiology , Hearing , Mesencephalon/physiology , Acoustic Stimulation , Auditory Cortex/physiology
4.
PLoS One ; 17(10): e0276205, 2022.
Article in English | MEDLINE | ID: mdl-36264952

ABSTRACT

Understanding temporal fluctuations of attention can benefit scientific knowledge and real-life applications. Studies of temporal attention have typically used the reaction time (RT), which can be measured only after a target presentation, as an index of attention level. We previously proposed the Micro-Pupillary Unrest Index (M-PUI), based on pupillary fluctuation amplitude, to estimate RT before the target presentation. However, it remains unclear which temporal attention effects the M-PUI reflects. We examined whether the M-PUI shows two types of temporal attention effects initially reported for RTs in variable-foreperiod tasks: the variable foreperiod effect (FP effect) and the sequential effect (SE). The FP effect refers to a decrease in RT as the foreperiod of the current trial increases, whereas the SE refers to an increase in RT in the early part of the foreperiod of the current trial as the foreperiod of the previous trial increases. We used a simple reaction task with medium-term variable foreperiods (the Psychomotor Vigilance Task) and found that the M-PUI primarily reflects the FP effect. Inter-individual analyses showed that the FP effect on the M-PUI, unlike other eye-movement indices, is correlated with the FP effect on RT. These results suggest that the M-PUI is a potentially powerful tool for investigating temporal fluctuations of attention to a partly unpredictable target.
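The two RT effects can be computed by binning trial-wise RTs by the current and the previous foreperiod. A numpy sketch on synthetic RTs constructed to exhibit both effects (the foreperiod values and effect sizes are illustrative, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
fps = rng.choice([2.0, 4.0, 8.0], size=n)   # hypothetical foreperiods (s)
prev = np.roll(fps, 1)
prev[0] = fps[0]                             # first trial has no predecessor
# Synthetic RTs: longer current FP -> faster (FP effect),
# longer previous FP -> slower (sequential effect), plus noise.
rt = 350 - 8 * fps + 5 * prev + rng.normal(0, 20, n)

def mean_rt_by(factor, rt, levels):
    """Mean RT at each level of a trial factor (current or previous FP)."""
    return np.array([rt[factor == lv].mean() for lv in levels])

levels = [2.0, 4.0, 8.0]
fp_effect = mean_rt_by(fps, rt, levels)   # decreases with current FP
se_effect = mean_rt_by(prev, rt, levels)  # increases with previous FP
```

The same binning applied to the M-PUI instead of RT is how one would ask which of the two effects the pupillary index tracks.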


Subject(s)
Eye Movements , Wakefulness , Reaction Time , Psychomotor Performance
5.
Front Neurosci ; 16: 816735, 2022.
Article in English | MEDLINE | ID: mdl-35368290

ABSTRACT

Maintaining a constant level of attention is required to achieve task performance, yet attention levels fluctuate over the course of daily activities. However, the brain dynamics leading to attentional fluctuations are still unknown. We investigated the underlying mechanisms of sustained attention using functional magnetic resonance imaging (fMRI). Participants were scanned with fMRI while performing an auditory, gradual-onset, continuous performance task (gradCPT), in which narrations gradually changed from one to the next. Participants pressed a button for frequent Go trials (i.e., male voices) as quickly as possible and withheld responses to infrequent No-go trials (i.e., female voices). Event-related analysis revealed that frontal and temporal areas, including the auditory cortex, were activated during both successful and unsuccessful inhibition of prepotent responses. Reaction-time (RT) variability throughout the auditory gradCPT was positively correlated with signal changes in regions of the dorsal attention network: the superior frontal gyrus and superior parietal lobule. Energy landscape analysis showed that task-related activations could be clustered into different attractors, corresponding to the dorsal attention network and the default mode network. The number of alternations between RT-stable and erratic periods increased with the number of transitions between attractors in the brain. We therefore conclude that dynamic transitions between brain states are closely linked to fluctuations of auditory attention.

6.
PLoS One ; 16(9): e0256953, 2021.
Article in English | MEDLINE | ID: mdl-34534237

ABSTRACT

Our daily activities require vigilance, so it is useful to monitor and predict vigilance levels externally using a straightforward method. The vigilance level is known to be linked to pupillary fluctuations via the locus coeruleus-norepinephrine (LC-NE) system. However, previous methods of estimating long-term vigilance require monitoring pupillary fluctuations at rest over a long period. We developed a method of predicting the short-term vigilance level by monitoring pupillary fluctuation over a shorter period of several seconds. Because LC activity also fluctuates on a timescale of seconds, we hypothesized that the short-term vigilance level could be estimated from pupillary fluctuations in a short period, and we quantified their amplitude as the Micro-Pupillary Unrest Index (M-PUI). We found an intra-individual, trial-by-trial positive correlation between reaction time (RT), reflecting the short-term vigilance level, and the M-PUI in the period immediately before target onset in a Psychomotor Vigilance Task (PVT). This relationship was most evident when the fluctuation was smoothed by a Hanning window of approximately 50 to 100 ms (including down-sampled data at 100 and 50 Hz) and the M-PUI was calculated over the one or two seconds before target onset. These results suggest that the M-PUI can monitor and predict fluctuating levels of vigilance. The M-PUI is also useful for examining short-period pupillary fluctuations to elucidate the psychophysiological mechanisms of short-term vigilance.
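The abstract specifies Hanning smoothing and a 1-2 s pre-target window but not the exact M-PUI formula; the sketch below follows the classical pupillary unrest index (cumulative absolute change of the smoothed trace), which the paper's index may refine. All parameters are illustrative.

```python
import numpy as np

def m_pui(pupil, fs=1000.0, win_ms=80, window_s=1.0):
    """Hypothetical M-PUI sketch: smooth the pupil trace with a Hanning
    window (~50-100 ms, per the abstract), then sum absolute sample-to-
    sample changes over the last `window_s` seconds before target onset."""
    w = np.hanning(int(win_ms / 1000 * fs))
    w /= w.sum()                                  # unit-gain smoothing kernel
    smooth = np.convolve(pupil, w, mode="same")
    seg = smooth[-int(window_s * fs):]            # pre-target segment
    return np.abs(np.diff(seg)).sum()

# Toy traces: a stable pupil vs. one with micro-fluctuations.
rng = np.random.default_rng(2)
t = np.arange(3000) / 1000.0
calm = 5.0 + 0.01 * np.sin(2 * np.pi * 0.3 * t)
unrest = calm + 0.05 * rng.standard_normal(t.size)
```

A noisier trace yields a larger index, which is the direction of the reported RT correlation (more unrest, slower responses).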


Subject(s)
Arousal/physiology , Locus Coeruleus/physiology , Psychomotor Performance/physiology , Reflex, Pupillary/physiology , Wakefulness/physiology , Adult , Female , Humans , Male , Norepinephrine/physiology , Pupil/physiology , Reaction Time/physiology , Time Factors
8.
Q J Exp Psychol (Hove) ; 74(4): 705-715, 2021 Apr.
Article in English | MEDLINE | ID: mdl-33103992

ABSTRACT

Sustained attention plays an important role in adaptive behaviours in everyday activities. As previous studies have mostly focused on vision, and attentional resources have been thought to be specific to sensory modalities, it is still unclear how mechanisms of attentional fluctuations overlap between visual and auditory modalities. To reduce the effects of sudden stimulus onsets, we developed a new gradual-onset continuous performance task (gradCPT) in the auditory domain and compared dynamic fluctuation of sustained attention in vision and audition. In the auditory gradCPT, participants were instructed to listen to a stream of narrations and judge the gender of each narration. In the visual gradCPT, they were asked to observe a stream of scenery images and indicate whether the scene was a city or mountain. Our within-individual comparison revealed that auditory and visual attention are similar in terms of the false alarm rate and dynamic properties including fluctuation frequency. Absolute timescales of the fluctuation in the two modalities were comparable, notwithstanding the difference in stimulus onset asynchrony. The results suggest that fluctuations of visual and auditory attention are underpinned by common principles and support models with a more central, modality-general controller.
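The abstract does not spell out how fluctuation was quantified; a standard measure in gradCPT studies is the reaction-time variance time course (VTC), sketched here on synthetic RTs. The window length and RT values are illustrative.

```python
import numpy as np

def variance_time_course(rts, win=9):
    """Variance time course (VTC): absolute z-scored RT per trial,
    smoothed with a moving average. High values mark erratic responding;
    low values mark stable, 'in the zone' periods."""
    z = (rts - rts.mean()) / rts.std()
    kernel = np.ones(win) / win
    return np.convolve(np.abs(z), kernel, mode="same")

# Toy session: a stable first half followed by an erratic second half.
rng = np.random.default_rng(3)
stable = rng.normal(400, 10, 100)
erratic = rng.normal(400, 80, 100)
vtc = variance_time_course(np.concatenate([stable, erratic]))
```

Comparing the dominant frequencies of such traces across visual and auditory sessions is one way to quantify whether the two modalities fluctuate on a common timescale.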


Subject(s)
Auditory Perception , Humans , Neuropsychological Tests
9.
J Neurosci ; 39(28): 5517-5533, 2019 07 10.
Article in English | MEDLINE | ID: mdl-31092586

ABSTRACT

The auditory system converts the physical properties of a sound waveform to neural activities and processes them for recognition. During the process, the tuning to amplitude modulation (AM) is successively transformed by a cascade of brain regions. To test the functional significance of the AM tuning, we conducted single-unit recording in a deep neural network (DNN) trained for natural sound recognition. We calculated the AM representation in the DNN and quantitatively compared it with those reported in previous neurophysiological studies. We found that an auditory-system-like AM tuning emerges in the optimized DNN. Better-recognizing models showed greater similarity to the auditory system. We isolated the factors forming the AM representation in the different brain regions. Because the model was not designed to reproduce any anatomical or physiological properties of the auditory system other than the cascading architecture, the observed similarity suggests that the AM tuning in the auditory system might also be an emergent property for natural sound recognition during evolution and development.

SIGNIFICANCE STATEMENT: This study suggests that neural tuning to amplitude modulation may be a consequence of the auditory system evolving for natural sound recognition. We modeled the function of the entire auditory system; that is, recognizing sounds from raw waveforms with as few anatomical or physiological assumptions as possible. We analyzed the model using single-unit recording, which enabled a fair comparison with neurophysiological data with as few methodological biases as possible. Interestingly, our results imply that frequency decomposition in the inner ear might not be necessary for processing amplitude modulation. This implication could not have been obtained if we had used a model that assumes frequency decomposition.


Subject(s)
Auditory Perception , Models, Neurological , Neural Networks, Computer , Brain/physiology , Humans , Sound
10.
Cereb Cortex ; 28(12): 4424-4439, 2018 12 01.
Article in English | MEDLINE | ID: mdl-30272122

ABSTRACT

Tonotopy is an essential functional organization in the mammalian auditory cortex, and its source in the primary auditory cortex (A1) is the incoming frequency-related topographical projections from the ventral division of the medial geniculate body (MGv). However, circuits that relay this functional organization to higher-order regions such as the secondary auditory field (A2) have yet to be identified. Here, we discovered a new pathway that projects directly from MGv to A2 in mice. Tonotopy was established in A2 even when primary fields including A1 were removed, which indicates that tonotopy in A2 can be established solely by thalamic input. Moreover, the structural nature of differing thalamocortical connections was consistent with the functional organization of the target regions in the auditory cortex. Retrograde tracing revealed that the region of MGv input to a local area in A2 was broader than the region of MGv input to A1. Consistent with this anatomy, two-photon calcium imaging revealed that neuronal responses in the thalamocortical recipient layer of A2 showed wider bandwidth and greater heterogeneity of the best frequency distribution than those of A1. The current study demonstrates a new thalamocortical pathway that relays frequency information to A2 on the basis of the MGv compartmentalization.


Subject(s)
Auditory Cortex/cytology , Auditory Cortex/physiology , Auditory Perception/physiology , Geniculate Bodies/cytology , Geniculate Bodies/physiology , Neurons/cytology , Neurons/physiology , Acoustic Stimulation , Animals , Auditory Pathways/cytology , Auditory Pathways/physiology , Male , Mice, Inbred C57BL , Neuroanatomical Tract-Tracing Techniques
11.
Network ; 20(4): 253-67, 2009.
Article in English | MEDLINE | ID: mdl-19919283

ABSTRACT

Sparse coding and related theories have successfully explained various response properties of early stages of sensory processing, such as the primary visual cortex and the peripheral auditory system, which suggests that the emergence of such properties results from adaptation of the nervous system to natural stimuli. The present study continues this line of research at a higher stage of auditory processing, focusing on harmonic structures that are often found in behaviourally important natural sounds such as animal vocalizations. Physiological studies have shown that the monkey primary auditory cortex (A1) contains neurons whose response properties capture such harmonic structures: their response and modulation peaks are often found at frequencies that are harmonically related to each other. We hypothesize that such relations emerge from sparse coding of harmonic natural sounds. Our simulation shows that similar harmonic relations emerge from frequency-domain sparse codes of harmonic sounds, namely piano performance and human speech. Moreover, the modulatory behaviours can be explained by competitive interactions of model neurons that capture partially shared harmonic structures.
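The idea of a frequency-domain sparse code for harmonic sounds can be illustrated with a toy example. This sketch swaps the paper's learned sparse coding model for greedy matching pursuit over a hand-built dictionary of harmonic-stack templates; the fundamentals, bandwidths, and grid are all hypothetical.

```python
import numpy as np

def harmonic_template(f0, freqs, n_harm=8, bw=10.0):
    """Unit-norm spectral template: Gaussian bumps at multiples of f0."""
    s = np.zeros_like(freqs)
    for k in range(1, n_harm + 1):
        s += np.exp(-0.5 * ((freqs - k * f0) / bw) ** 2)
    return s / np.linalg.norm(s)

def matching_pursuit(x, D, n_atoms=3):
    """Greedy sparse inference: repeatedly pick the dictionary atom most
    correlated with the residual. D holds unit-norm atoms as columns."""
    r = x.copy()
    code = np.zeros(D.shape[1])
    for _ in range(n_atoms):
        corr = D.T @ r
        j = np.argmax(np.abs(corr))
        code[j] += corr[j]
        r -= corr[j] * D[:, j]
    return code

freqs = np.linspace(0, 4000, 2000)
f0s = np.arange(100, 500, 20)                    # candidate fundamentals (Hz)
D = np.stack([harmonic_template(f, freqs) for f in f0s], axis=1)
x = 2.0 * harmonic_template(220, freqs)          # a 220 Hz harmonic sound
code = matching_pursuit(x, D)
```

A single active coefficient at the 220 Hz atom summarizes the whole harmonic stack, the kind of compact representation that sparse coding favours for harmonic input.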


Subject(s)
Auditory Cortex/physiology , Auditory Perception/physiology , Pitch Perception/physiology , Vocalization, Animal/physiology , Acoustics , Action Potentials/physiology , Animals , Auditory Pathways/physiology , Electrophysiology , Evoked Potentials, Auditory/physiology , Haplorhini , Humans , Models, Neurological , Neurons/physiology , Orientation/physiology , Perceptual Masking/physiology , Signal Processing, Computer-Assisted , Sound , Sound Localization/physiology , Sound Spectrography , Speech Acoustics , Speech Perception/physiology