1.
Cephalalgia ; 44(2): 3331024241230279, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38416486

ABSTRACT

BACKGROUND: To date, a number of studies on migraine have cross-sectionally evaluated sensory sensitivity using aversion thresholds/scores along the migraine cycle, reporting a decreased tolerance to sensory stimuli across different sensory modalities. Our hypothesis was that patients with migraine would exhibit heightened sensitivity to sound, light, touch and smell on days when they reported greater headache intensity. METHODS: This is an exploratory, longitudinal study carried out over the course of 27 days. Aversion thresholds or scores for sound, light, touch and smell were quantified in six patients with migraine (11.33 ± 6.53 headache days/month). RESULTS: Patients reported increased sensitivity to light (padj = 0.0297), touch (padj = 0.0077) and smell (padj = 0.0201) on days with higher headache intensity. However, greater sensitivity to sound on days with higher headache intensity was reported only when anxiety levels were high (padj = 1.4e-06). Interestingly, variable levels of tolerance to bothersome light over time can also influence the correlation between light sensitivity and headache intensity (padj = 1.4e-06). CONCLUSIONS: Based on the present findings, future longitudinal studies evaluating sensory threshold changes along the migraine cycle in patients with migraine should account for the increased tolerance to bothersome light over time, as well as for the effect of anxiety on auditory sensitivity.
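As an illustration of the kind of day-level analysis this design implies, the sketch below fits one linear mixed-effects model per sensory modality, relating daily aversion scores to headache intensity with a random intercept per patient, and then adjusts the p-values across modalities. The file name, column names and the Benjamini-Hochberg correction are assumptions for the example, not the authors' exact pipeline.

import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.multitest import multipletests

# Hypothetical long-format diary data: one row per patient per day.
# Assumed columns: patient, day, headache_intensity, and one aversion
# score per modality (light, touch, smell, sound).
df = pd.read_csv("daily_diary.csv")

modalities = ["light", "touch", "smell", "sound"]
pvals = []
for m in modalities:
    # Random intercept per patient accounts for the repeated daily measures.
    model = smf.mixedlm(f"{m} ~ headache_intensity", df, groups=df["patient"])
    fit = model.fit()
    pvals.append(fit.pvalues["headache_intensity"])

# Adjust across the four sensory modalities (Benjamini-Hochberg here; the
# correction actually used in the study may differ).
_, padj, _, _ = multipletests(pvals, method="fdr_bh")
print(dict(zip(modalities, padj)))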


Subject(s)
Migraine Disorders , Touch Perception , Humans , Longitudinal Studies , Headache , Sensory Thresholds
2.
J Headache Pain ; 24(1): 104, 2023 Aug 07.
Article in English | MEDLINE | ID: mdl-37545005

ABSTRACT

BACKGROUND: Migraine is a cyclic, neurosensory disorder characterized by recurrent headaches and altered sensory processing. The latter manifests as hypersensitivity to visual stimuli, measured with questionnaires and sensory thresholds, as well as abnormal cortical excitability and a lack of habituation, assessed with visual evoked potentials elicited by pattern-reversal stimulation. Here, the goal was to determine whether factors such as age and/or disease severity may exert a modulatory influence on sensory sensitivity, cortical excitability, and habituation. METHODS: Two similar experiments were carried out, the first comparing 24 young episodic migraine patients with 28 healthy age- and gender-matched controls, and the second comparing 36 middle-aged episodic migraine patients with 30 healthy age- and gender-matched controls. A neurologist confirmed the diagnoses. Migraine phases were obtained using eDiaries. Sensory sensitivity was assessed with the Sensory Perception Quotient, and group comparisons were carried out. We obtained pattern-reversal visual evoked potentials and calculated the N1-P1 peak-to-peak amplitude. Two linear mixed-effects models were fitted to these data. The first model had Block (first block, last block) and Group (patients, controls) as fixed factors, whereas the second model had Trial (all trials) and Group as fixed factors. Participant was included as a random factor in both. The N1-P1 first-block amplitude was used to assess cortical excitability, and habituation was defined as a decrease of the N1-P1 amplitude across Blocks/Trials. Both experiments were performed interictally. RESULTS: The final samples consisted of 18 patients with episodic migraine and 27 headache-free controls (first experiment) and 19 patients and 29 controls (second experiment). In both experiments, patients reported increased visual hypersensitivity on the Sensory Perception Quotient compared to controls. Regarding the N1-P1 peak-to-peak data, there was no main effect of Group, indicating no differences in cortical excitability between groups. Finally, significant main effects of both Block and Trial were found, indicating habituation in both groups regardless of age and headache frequency. CONCLUSIONS: The results of this study yielded evidence for significant hypersensitivity in patients but no significant differences in either habituation or cortical excitability compared to headache-free controls. Although the alterations in patients may be less pronounced than originally anticipated, they demonstrate the need for the definition and standardization of optimal methodological parameters.
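A minimal sketch of the first mixed-effects model described above (Block and Group as fixed factors, Participant as a random factor), assuming the N1-P1 amplitudes have already been extracted into a long-format table; the file and column names, and the inclusion of a Block x Group interaction, are illustrative assumptions rather than the authors' exact specification.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format table: one row per participant per block.
# Assumed columns: participant, group (patient/control), block (first/last), amplitude.
vep = pd.read_csv("n1p1_amplitudes.csv")

# Block and Group as fixed factors, Participant as a random (intercept) factor.
m1 = smf.mixedlm("amplitude ~ C(block) * C(group)", vep, groups=vep["participant"]).fit()
print(m1.summary())

# Habituation appears as a negative Block effect (amplitude decrease from the first
# to the last block); a Group effect would indicate excitability differences.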


Subject(s)
Evoked Potentials, Visual , Migraine Disorders , Humans , Middle Aged , Habituation, Psychophysiologic/physiology , Headache , Patient Acuity , Case-Control Studies
3.
Cephalalgia ; 42(13): 1305-1316, 2022 11.
Article in English | MEDLINE | ID: mdl-35815637

ABSTRACT

BACKGROUND: Past studies do not account for avoidance behaviour in migraine as a potential confounder of phonophobia. OBJECTIVE: To analyse whether phonophobia is partially driven by avoidance behaviour when using the classic methodology (method of limits). METHODS: This is a case-control study in which we tested phonophobia in a cohort of high-frequency/chronic migraine patients (15.5 ± 0.74 headache days/month) and non-headache controls. Auditory stimuli, delivered to both ears, were presented using three different paradigms: the method of limits, the method of constant stimuli, and the adaptive method. Participants were asked to report how bothersome each tone was until a sound aversion threshold was estimated for each method. RESULTS: In this study, we successfully replicated the previously reported reduction in sound aversion threshold using three different methods in a group of 35 patients and 25 controls (p < 0.0001). Avoidance behaviour in migraine reduced the sound aversion threshold in the method of limits (p = 0.0002) and the adaptive method (p < 0.0001) compared to the method of constant stimuli, whereas thresholds in controls remained the same across methods (method of limits, p = 0.9877; adaptive method, p = 1). CONCLUSION: Avoidance behaviour can exacerbate phonophobia. The current methodology to measure phonophobia needs to be revised.
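The abstract does not spell out how the adaptive paradigm was implemented; the sketch below shows one common way such a method can work, a simple 1-up/1-down staircase that averages reversal levels to estimate a sound aversion threshold. The response callback, step size, stopping rule and level limits are assumptions for illustration, not the authors' parameters.

import numpy as np

def staircase_threshold(is_bothersome, start_db=60.0, step_db=4.0,
                        n_reversals=8, lo=30.0, hi=100.0):
    # is_bothersome(level_db) -> bool is a hypothetical callback that plays a
    # tone at level_db and returns the participant's yes/no aversion report.
    level, direction, reversals = start_db, -1, []
    while len(reversals) < n_reversals:
        response = is_bothersome(level)
        new_direction = -1 if response else 1      # go quieter if bothersome, louder otherwise
        if new_direction != direction:             # direction change = reversal
            reversals.append(level)
            direction = new_direction
        level = float(np.clip(level + direction * step_db, lo, hi))
    return float(np.mean(reversals))               # threshold: mean of the reversal levels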


Subject(s)
Hyperacusis , Migraine Disorders , Humans , Case-Control Studies , Avoidance Learning
4.
Eur J Neurosci ; 49(2): 150-164, 2019 01.
Article in English | MEDLINE | ID: mdl-30270546

ABSTRACT

In everyday multisensory events, such as a glass crashing on the floor, the different sensory inputs are often experienced as simultaneous, even though the sensory processing of sound and sight within the brain is temporally misaligned. This lack of cross-modal synchrony is the unavoidable consequence of the different speeds of light and sound, and of their different neural transmission times in the corresponding sensory pathways. Hence, cross-modal synchrony must be reconstructed during perception. It has been suggested that spontaneous fluctuations in neural excitability might be involved in the temporal organisation of sensory events during perception and account for variability in behavioural performance. Here, we addressed the relationship between ongoing brain oscillations and the perception of cross-modal simultaneity. Participants performed an audio-visual simultaneity judgement task while their EEG was recorded. We focused on pre-stimulus activity and found that the phase of neural oscillations at 13 ± 2 Hz, 200 ms prior to the stimulus, correlated with the subjective simultaneity of otherwise identical sound-flash events. Remarkably, the correlation between EEG phase and behavioural report occurred in the absence of concomitant changes in EEG amplitude. The probability of simultaneity perception fluctuated significantly as a function of pre-stimulus phase, with the largest perceptual variation being accounted for by phase angles nearly 180° apart. This pattern was strongly reliable for sound-flash pairs but not for flash-sound pairs. Overall, these findings suggest that the phase of ongoing brain activity might underlie internal states of the observer that influence cross-modal temporal organisation between the senses and, in turn, subjective synchrony.
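A rough sketch of how the pre-stimulus phase measure described above could be computed for a single channel and trial: band-pass the EEG around 13 ± 2 Hz, take the analytic phase, read it out 200 ms before stimulus onset, then bin trials by phase and compare the proportion of "simultaneous" reports across bins. The sampling rate, filter order, simulated data and binning are illustrative assumptions, not the authors' exact analysis.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def prestimulus_phase(eeg, fs, stim_sample, f_lo=11.0, f_hi=15.0, t_pre=0.2):
    # Band-pass around 13 +/- 2 Hz, then read the instantaneous (analytic)
    # phase at the sample 200 ms before stimulus onset.
    b, a = butter(4, [f_lo, f_hi], btype="bandpass", fs=fs)
    phase = np.angle(hilbert(filtfilt(b, a, eeg)))
    return phase[int(stim_sample - t_pre * fs)]

# Hypothetical single-channel trials (n_trials x n_samples) at 500 Hz with the
# stimulus at sample 1000, plus made-up "simultaneous" reports.
rng = np.random.default_rng(0)
trials = rng.standard_normal((200, 1500))
reports = rng.random(200) > 0.5

phases = np.array([prestimulus_phase(tr, 500, 1000) for tr in trials])
edges = np.linspace(-np.pi, np.pi, 9)          # eight phase bins
bin_idx = np.digitize(phases, edges[1:-1])
p_simultaneous = [reports[bin_idx == k].mean() for k in range(8)]
print(p_simultaneous)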


Subject(s)
Auditory Perception/physiology , Brain Waves , Brain/physiology , Visual Perception/physiology , Acoustic Stimulation , Adolescent , Adult , Female , Humans , Judgment/physiology , Male , Photic Stimulation , Young Adult
5.
Front Integr Neurosci ; 10: 44, 2016.
Article in English | MEDLINE | ID: mdl-28154529

ABSTRACT

Perception in multi-sensory environments involves both grouping and segregation of events across sensory modalities. Temporal coincidence between events is considered a strong cue for resolving multisensory perception. However, differences in physical transmission and neural processing times amongst modalities complicate this picture. This is illustrated by cross-modal recalibration, whereby adaptation to audio-visual asynchrony produces shifts in perceived simultaneity. Here, we examined whether voluntary actions might serve as a temporal anchor for cross-modal recalibration in time. Participants were tested on an audio-visual simultaneity judgment task after an adaptation phase in which they had to synchronize voluntary actions with audio-visual pairs presented at a fixed asynchrony (vision leading or vision lagging). Our analysis focused on the magnitude of cross-modal recalibration to the adapted audio-visual asynchrony as a function of the nature of the actions during adaptation, putatively fostering cross-modal grouping or segregation. We found larger temporal adjustments when actions promoted grouping rather than segregation of sensory events. However, a control experiment suggested that additional factors, such as attention to the planning/execution of actions, could have an impact on recalibration effects. Contrary to the view that cross-modal temporal organization is mainly driven by external factors related to the stimulus or environment, our findings add supporting evidence for the idea that perceptual adjustments strongly depend on the observer's inner states induced by motor and cognitive demands.
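Recalibration magnitude in this kind of design is commonly expressed as a shift of the point of subjective simultaneity (PSS). The sketch below fits a Gaussian to the proportion of "simultaneous" responses as a function of audio-visual asynchrony and reads the PSS off the fitted centre; the SOAs, response proportions and the Gaussian model are illustrative assumptions, not the authors' exact analysis.

import numpy as np
from scipy.optimize import curve_fit

def gaussian(soa, amp, pss, sigma):
    # Proportion of "simultaneous" responses as a bell curve over SOA.
    return amp * np.exp(-((soa - pss) ** 2) / (2 * sigma ** 2))

soas = np.array([-300, -200, -100, 0, 100, 200, 300], dtype=float)   # ms, vision-lead negative (assumed)
p_simult = np.array([0.10, 0.35, 0.75, 0.95, 0.80, 0.40, 0.15])      # hypothetical data

params, _ = curve_fit(gaussian, soas, p_simult, p0=[1.0, 0.0, 100.0])
pss = params[1]
# Recalibration = difference in PSS between the two adaptation conditions
# (e.g., vision-leading vs vision-lagging adapters).
print(f"PSS = {pss:.1f} ms")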

6.
PLoS One ; 9(7): e99311, 2014.
Article in English | MEDLINE | ID: mdl-25004132

ABSTRACT

Temporal recalibration of cross-modal synchrony has been proposed as a mechanism to compensate for timing differences between sensory modalities. However, far from the rich complexity of everyday sensory environments, most studies to date have examined recalibration with isolated cross-modal pairings. Here, we hypothesize that selective attention might provide an effective filter to help resolve which stimuli are selected when multiple events compete for recalibration. We addressed this question by testing audio-visual recalibration following an adaptation phase in which two opposing audio-visual asynchronies were present. The direction of voluntary visual attention, and therefore adaptation to one of the two possible asynchronies (flash leading or flash lagging), was manipulated using colour as a selection criterion. We found a shift in the point of subjective audio-visual simultaneity as a function of whether the observer had focused attention on audio-then-flash or on flash-then-audio groupings during the adaptation phase. A baseline adaptation condition revealed that this effect of endogenous attention was effective only toward the lagging flash. This hints at the role of exogenous capture and/or additional endogenous effects producing an asymmetry toward the leading flash. We conclude that selective attention helps promote selected audio-visual pairings to be combined and subsequently adjusted in time, but stimulus organization exerts a strong impact on recalibration. We tentatively hypothesize that the resolution of recalibration in complex scenarios involves the orchestration of top-down selection mechanisms and stimulus-driven processes.
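Building on the PSS-fitting sketch above, the attention effect reported here would typically be tested as a within-participant difference in PSS between the two attended adaptation conditions; the values below are made-up numbers purely to keep the snippet runnable, not data from the study.

import numpy as np
from scipy import stats

# Hypothetical per-participant PSS estimates (ms) after adapting while attending
# the flash-leading vs the flash-lagging audio-visual pairing.
pss_attend_lead = np.array([ 12.0,  5.0, 20.0,  8.0, 15.0, -2.0, 10.0, 18.0])
pss_attend_lag  = np.array([-15.0, -4.0, -8.0,  2.0, -12.0, -6.0, -10.0, -3.0])

# An attention-driven recalibration effect shows up as a systematic PSS
# difference between the two attended adaptation conditions.
t, p = stats.ttest_rel(pss_attend_lead, pss_attend_lag)
print(f"mean shift = {np.mean(pss_attend_lead - pss_attend_lag):.1f} ms, p = {p:.3f}")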


Subject(s)
Attention , Acoustic Stimulation , Adaptation, Psychological , Adolescent , Adult , Auditory Perception , Humans , Photic Stimulation , Reaction Time , Time Perception , Visual Perception , Young Adult
7.
Int J Psychophysiol ; 89(1): 136-47, 2013 Jul.
Article in English | MEDLINE | ID: mdl-23797145

ABSTRACT

Audiovisual speech perception has frequently been studied at the phoneme, syllable and word processing levels. Here, we examined the constraints that visual speech information might exert during the recognition of words embedded in a natural sentence context. We recorded event-related potentials (ERPs) to words that could be either strongly or weakly predictable on the basis of the prior semantic sentential context and whose initial phoneme varied in the degree of visual saliency of the lip movements. When the sentences were presented audio-visually (Experiment 1), words weakly predicted by the semantic context elicited a larger, long-lasting N400 compared to strongly predictable words. This semantic effect interacted with the degree of visual saliency over a late part of the N400. When comparing audio-visual versus auditory-alone presentation (Experiment 2), the typical amplitude-reduction effect on the auditory-evoked N100 response was observed in the audiovisual modality. Interestingly, a specific benefit of high- versus low-visual-saliency constraints occurred over the early N100 response and at the late N400 time window, confirming the result of Experiment 1. Taken together, our results indicate that the saliency of visual speech can exert an influence on both auditory processing and word recognition at relatively late stages, and thus suggest strong interactivity between audio-visual integration and other (arguably higher) stages of information processing during natural speech comprehension.
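A rough sketch of how a condition-wise N400 measure like the one described above could be extracted with MNE-Python, assuming epochs have already been cleaned and saved with event tags encoding the 2 x 2 design (semantic predictability x visual saliency); the file name, event labels, channel selection and the 300-500 ms window are assumptions, not the paper's exact parameters.

import mne

# Hypothetical epoched EEG for one participant, with '/'-separated event tags.
epochs = mne.read_epochs("sub01_words-epo.fif")

conditions = ["strong/high_sal", "strong/low_sal", "weak/high_sal", "weak/low_sal"]
n400 = {}
for cond in conditions:
    evoked = epochs[cond].average()
    # Mean amplitude in an assumed N400 window over centro-parietal channels.
    n400[cond] = evoked.copy().pick(["Cz", "Pz", "CPz"]).crop(0.3, 0.5).data.mean()

print(n400)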


Subject(s)
Recognition, Psychology/physiology , Speech Perception/physiology , Visual Perception/physiology , Acoustic Stimulation , Adolescent , Adult , Comprehension , Data Interpretation, Statistical , Electroencephalography , Evoked Potentials, Auditory/physiology , Female , Fixation, Ocular , Humans , Male , Phonetics , Photic Stimulation , Psycholinguistics , Reading , Semantics , Young Adult