Results 1 - 20 of 628
1.
Front Psychiatry ; 15: 1428425, 2024.
Article in English | MEDLINE | ID: mdl-39371911

ABSTRACT

Background: Major depressive disorder (MDD) is associated with deficits in cognitive function, thought to be related to underlying decreased hedonic experiences. Further research is needed to fully elucidate the role of functional brain activity in this relationship. In this study, we investigated the neurofunctional correlates of the interplay between cognitive function and hedonic experiences in medication-free MDD using functional near-infrared spectroscopy (fNIRS). Methods: We examined differences in brain activation during the verbal fluency test (VFT) between MDD patients and healthy controls (HCs). Fifty-six MDD patients and 35 HCs underwent fNIRS recording while performing the VFT. In exploratory analyses, cognitive performance, as assessed by the Cambridge Neuropsychological Test Automated Battery (CANTAB), four dimensions of hedonic processing (desire, motivation, effort, and consummatory pleasure) measured by the Dimensional Anhedonia Rating Scale (DARS), and relative changes in oxygenated hemoglobin concentration during the VFT were compared across groups. Results: Patients with MDD demonstrated impairments in sustained attention and working memory, accompanied by lower total and subscale scores on the DARS. Compared to healthy controls, MDD patients exhibited reduced activation in the prefrontal cortex (PFC) during the VFT task (t = 2.32 to 4.77, p < 0.001 to 0.02, FDR corrected). DARS motivation, desire, and total scores, as well as sustained attention, were positively correlated with activation in the dorsolateral PFC and Broca's area (p < 0.05, FDR corrected). Conclusions: These findings indicate that changes in prefrontal oxygenated hemoglobin levels, in a region implicated in hedonic motivation and cognitive function, may serve as potential biomarkers for interventions targeting individuals with MDD. Our results corroborate the clinical consensus that the prefrontal cortex is a primary target for non-invasive neuromodulatory treatments for depression.
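As an illustration of the channel-wise group comparison with FDR correction reported above, the following Python sketch runs independent-samples t-tests across fNIRS channels and applies a Benjamini-Hochberg correction. The simulated data, the 52-channel montage, and all variable names are assumptions for illustration; this is not the authors' analysis code.

```python
# Hedged sketch: channel-wise HbO group comparison with Benjamini-Hochberg FDR
# correction. Montage size, effect sizes, and names are invented for illustration.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
n_channels = 52                                           # assumed fNIRS montage
hbo_mdd = rng.normal(0.02, 0.05, size=(56, n_channels))   # mean HbO change, 56 patients
hbo_hc = rng.normal(0.05, 0.05, size=(35, n_channels))    # mean HbO change, 35 controls

# Independent-samples t-test for each channel (HC vs. MDD)
t_vals, p_vals = stats.ttest_ind(hbo_hc, hbo_mdd, axis=0)

# Control the false discovery rate across channels
reject, p_fdr, _, _ = multipletests(p_vals, alpha=0.05, method="fdr_bh")
for ch in np.flatnonzero(reject):
    print(f"channel {ch:2d}: t = {t_vals[ch]:.2f}, FDR-corrected p = {p_fdr[ch]:.3f}")
```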

2.
Neurosci Lett ; 842: 137998, 2024 Sep 27.
Article in English | MEDLINE | ID: mdl-39343192

ABSTRACT

Recent studies have prompted a shift in the understanding of attention deficit hyperactivity disorder (ADHD) from models positing dysfunction of individual brain areas to those that assume alterations in large-scale brain networks. Despite this shift, the underlying neural mechanism of ADHD in the adult population remains uncertain. Using functional magnetic resonance imaging (fMRI), this study examined the connectivity of the dorsal and ventral attention networks. Adults with and without ADHD completed a Go/No-Go task inside the scanner, and the functional connectivity of the attention networks was analysed. The generalized psychophysiological interaction analysis indicated group differences involving the dorsal attention network. For the ADHD group, an interaction effect revealed altered modulation of dorsal attention-default mode network connectivity, particularly between the right frontal eye field and the posterior cingulate gyrus. We conclude that dorsal attention network dysfunction may be involved in sustained attention deficits in adult ADHD. This study sheds light on network-level alterations, contributing to the understanding of adult ADHD and offering a potential avenue for future research and clinical interventions.

3.
Neurobiol Aging ; 144: 93-103, 2024 Sep 17.
Article in English | MEDLINE | ID: mdl-39298870

ABSTRACT

Sustained attention is important for maintaining cognitive function and autonomy during ageing, yet older people often show reductions in this domain. The underlying neurobiology is not yet well understood, with most neuroimaging studies focused primarily on fMRI. Here, we utilise structural MRI (sMRI) to investigate the relationships between age, structural brain volumes, and sustained attention performance. Eighty-nine healthy older adults (50-84 years; mean age 65.5 years, SD = 8.4; 74 female) underwent MRI brain scanning and completed two sustained attention tasks: a rapid visual information processing (RVP) task and the sustained attention to response task (SART). Independent hierarchical linear regressions demonstrated that greater white matter hyperintensity (WMH) volumes were associated with worse RVP A' performance, whereas greater grey matter volumes were associated with better RVP A' performance. Further, greater cerebral white matter volumes were associated with better SART d' performance. Importantly, mediation analyses revealed that both grey and white matter volumes completely mediated the relationship between ageing and sustained attention. These results help explain disparate attentional findings in older adults, highlighting the mediating role of brain structure.
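For readers unfamiliar with the mediation approach mentioned above, the sketch below illustrates a simple regression-based indirect effect (a x b) with a percentile bootstrap on simulated data. The single-mediator model, the column names (age, grey_matter, rvp_a), and the simulated values are assumptions for illustration only; the authors' exact covariates and software are not reproduced here.

```python
# Minimal sketch of a regression-based mediation analysis (indirect effect a*b
# with a percentile bootstrap), assuming a single mediator and invented data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def indirect_effect(df):
    a = smf.ols("grey_matter ~ age", data=df).fit().params["age"]                    # X -> M
    b = smf.ols("rvp_a ~ grey_matter + age", data=df).fit().params["grey_matter"]    # M -> Y | X
    return a * b

rng = np.random.default_rng(1)
n = 89
age = rng.uniform(50, 84, n)
grey = 700 - 2.0 * age + rng.normal(0, 20, n)         # simulated volume decline with age
rvp = 0.85 + 0.0005 * grey + rng.normal(0, 0.02, n)   # simulated attention score
df = pd.DataFrame({"age": age, "grey_matter": grey, "rvp_a": rvp})

point = indirect_effect(df)
boot = [indirect_effect(df.sample(n, replace=True, random_state=i)) for i in range(1000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {point:.4f}, 95% bootstrap CI [{lo:.4f}, {hi:.4f}]")
```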

4.
Behav Processes ; 222: 105097, 2024 Sep 19.
Article in English | MEDLINE | ID: mdl-39299355

ABSTRACT

The ability of nervous systems to filter out irrelevant and repetitive stimuli may prevent animals from becoming 'saturated' with excess information. However, animals must be selective about which stimuli to attend to and which to ignore, as mistakes may be costly. Using a comparative approach, we explored the effect of the interstimulus interval (ISI) between repeated presentations of visual stimuli on a screen, testing the decrease in responses (response decrement) of both Trite planiceps jumping spiders and untrained Columba livia pigeons, animals with comparable visual abilities despite structurally different visual systems and brain sizes. We used ISIs of 2.5 s, 5 s, and 10 s, predicting that decreases in ISI would lead to progressively fewer responses to the stimuli. Following previous work on T. planiceps, we also manipulated pigeon hunger level, finding that hungry birds were initially more responsive than sated pigeons, but the rate of decrease in responses to the stimulus did not differ between the two groups. While a clear response decrement was seen in both species across all conditions, shorter ISIs resulted in more dramatic response decrements, aligning with previous work and with the resource depletion theory posited in the human-based literature.

5.
Front Psychol ; 15: 1448226, 2024.
Article in English | MEDLINE | ID: mdl-39301008

ABSTRACT

Three experiments (N = 336) examined whether participants can systematically adjust levels of mind wandering on command. Participants performed four blocks of the metronome response task (MRT), in which they pressed a spacebar in sync with a steady audio tone. Levels of spontaneous and deliberate mind wandering were measured using intermittent thought probes. Performance was indexed with MRT response time variability and omission errors. Each block started with instructions to mind wander either 20, 40, 60, or 80% of the time. Analyses were primarily conducted using linear mixed-effects models. We found that mind wandering (spontaneous and deliberate), response time variability, and omission errors increased progressively with instructions to mind wander more, and that these instruction-related changes were larger for deliberate than for spontaneous mind wandering (Experiments 1-3). This pattern held regardless of whether participants' eyes were open or shut (Experiment 2). Relative to a control group receiving no commands to mind wander, instructing people to mind wander 60 or 80% of the time led to more deliberate mind wandering, and, strikingly, asking people to mind wander 20% of the time led to less spontaneous mind wandering (Experiment 3). Our results suggest that individuals can titrate mind wandering experiences to roughly match instructed levels, indicating that mind wandering can be manipulated through simple instructions. However, other features of the data suggest that such titration is effortful and may come with a cost to performance.
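The following sketch shows the general form of the linear mixed-effects model described above: probe-caught mind wandering regressed on the instructed level, with a random intercept per participant. The simulated data, the numbers of participants and blocks, and the variable names are assumptions; this is not the authors' code.

```python
# Hedged sketch of a linear mixed-effects model: reported mind wandering as a
# function of instructed level, with random intercepts by participant.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
participants = np.repeat(np.arange(60), 4)          # 60 simulated participants x 4 blocks
instructed = np.tile([20, 40, 60, 80], 60)          # instructed mind-wandering percentage
subj_intercept = rng.normal(0, 8, 60)[participants]
reported_mw = 10 + 0.6 * instructed + subj_intercept + rng.normal(0, 10, len(instructed))

df = pd.DataFrame({"participant": participants,
                   "instructed": instructed,
                   "reported_mw": reported_mw})

model = smf.mixedlm("reported_mw ~ instructed", data=df, groups=df["participant"])
result = model.fit()
print(result.summary())
```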

6.
Psychon Bull Rev ; 2024 Sep 16.
Article in English | MEDLINE | ID: mdl-39285130

ABSTRACT

Sustained attention fluctuates over time, affecting task-related processing and memory. However, it is less clear how attentional state affects processing and memory when images are accompanied by irrelevant visual information. We first quantify behavioral signatures of attentional state in an online sample (N1 = 92) and demonstrate that images presented in high attentional states are better remembered. Next, we test how sustained attention influences memory in two online samples (N2 = 188, N3 = 185) when task-irrelevant images are present. We show that high attention leads to better memory for both task-relevant and task-irrelevant images. This suggests that sustained attentional state does not selectively affect processing of task-relevant information, but rather affects processing broadly, regardless of task relevance. Finally, we show that other components of attention, such as selective attention, contribute to the mnemonic fate of stimuli. Our findings highlight the necessity of considering and characterizing attention's unique components and their effects on cognition.

7.
Physiol Behav ; 287: 114666, 2024 Aug 30.
Article in English | MEDLINE | ID: mdl-39216809

ABSTRACT

INTRODUCTION: In healthy individuals, exposure to moderate levels of simulated hypoxia has subtle cognitive effects relative to ground level. However, there are few data on the cognitive consequences of combining hypoxia with partial sleep deprivation, a classic military and civilian operational context. In this study, we tested the hypothesis that exposure to moderate hypoxia while sleep-restricted impairs several domains of cognition, and we also assessed physiological parameters and salivary concentrations of cortisol and alpha-amylase. METHOD: Seventeen healthy males completed two sessions of cognitive tests (sustained attention using the psychomotor vigilance task (PVT), and executive functions using the Go-NoGo inhibition task and the N-Back working memory task) after 30 min (T + 30') and 4 h (T + 240') of exposure in a normobaric hypoxic tent (FIO2 = 13.6 %, ≃ 3,500 m) (HY). Testing followed either one night of sleep restriction (3 a.m. to 6 a.m. bedtime, SRHY) or one night of habitual sleep (10 p.m. to 6 a.m. bedtime, HSHY), with cross-over randomization. Sleep architecture for the two nights and physiological parameters (oxygen saturation (SpO2) and heart rate (HR)) during the T + 30' and T + 240' sessions were analyzed. Salivary cortisol and alpha-amylase (sAA) concentrations were analyzed before hypoxia, after the T + 30' and T + 240' cognitive sessions, and after leaving the hypoxic tent. RESULTS: Sustained attention (RT and number of lapses in the PVT) and executive functions (Go-NoGo and 1-Back and 2-Back parameters, as inhibition and working memory signatures) were impaired in the SRHY condition compared with HSHY. SpO2 and HR were higher after 4 h than after 30 min of hypoxia in the HSHY condition, whereas only HR was statistically higher in the SRHY condition. In SRHY, sAA concentration was lower and cortisol concentration higher than in HSHY. A significant increase in sAA concentration was observed after the cognitive session at 4 h of hypoxia exposure compared with that at 30 min, only in the SRHY condition. There were significant positive correlations between reaction time and the corresponding heart rate (a non-invasive marker of physiological stress) for the executive tasks in both sleep conditions. This was not observed for salivary sAA and cortisol levels, reliable indicators of the sympathoadrenomedullary and hypothalamic-pituitary-adrenocortical systems, respectively. CONCLUSION: Exposure to moderate normobaric hypoxia (≃ 3,500 m / ≃ 11,500 ft simulated) after a single night of 3-hour sleep impairs cognitive performance after 30 min and 4 h of exposure. The key determinants and/or mechanisms responsible for cognitive impairment under moderate hypoxia with sleep restriction, particularly for executive function, have yet to be elucidated.
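As background for the PVT measures mentioned above (mean reaction time and lapses), the sketch below shows conventional PVT scoring. The 500 ms lapse criterion, the 100 ms anticipation cutoff, and the simulated reaction times are assumptions; the authors' exact scoring rules are not given in the abstract.

```python
# Hedged sketch of conventional PVT scoring on simulated reaction-time data.
import numpy as np

def pvt_metrics(rts_ms, lapse_threshold_ms=500):
    """Return mean RT, number of lapses, and mean response speed (1/RT)."""
    rts = np.asarray(rts_ms, dtype=float)
    valid = rts[rts > 100]                    # drop anticipations (< 100 ms), an assumption
    return {
        "mean_rt_ms": round(float(valid.mean()), 1),
        "lapses": int((valid >= lapse_threshold_ms).sum()),
        "mean_speed_per_s": round(float((1000.0 / valid).mean()), 2),
    }

rng = np.random.default_rng(3)
rts_rested = rng.gamma(shape=20, scale=13, size=100)            # ~260 ms on average
rts_sleep_restricted = rng.gamma(shape=20, scale=17, size=100)  # slower, more lapses
print("habitual sleep + hypoxia:", pvt_metrics(rts_rested))
print("sleep restriction + hypoxia:", pvt_metrics(rts_sleep_restricted))
```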

8.
Dev Psychobiol ; 66(7): e22538, 2024 Nov.
Article in English | MEDLINE | ID: mdl-39192662

ABSTRACT

Most studies of developing visual attention are conducted using screen-based tasks in which infants move their eyes to select where to look. However, real-world visual exploration entails active movements of both eyes and head to bring relevant areas into view. Thus, relatively little is known about how infants coordinate their eyes and heads to structure their visual experiences. Infants were tested every 3 months from 9 to 24 months while playing with their caregiver and three toys, seated in a highchair at a table. Infants wore a head-mounted eye tracker that measured eye movements toward each of the visual targets (the caregiver's face and the toys) and how targets were oriented within the head-centered field of view (FOV). With age, infants increasingly aligned novel toys in the center of their head-centered FOV at the expense of their caregiver's face. Both faces and toys were better centered in view during longer looking events, suggesting that infants of all ages aligned their eyes and head to sustain attention. The bias in infants' head-centered FOV could not be accounted for by manual action: held toys were more poorly centered than non-held toys. We discuss developmental factors (attentional, motoric, cognitive, and social) that may explain why infants increasingly adopted biased viewpoints with age.


Subject(s)
Attention , Child Development , Eye Movements , Eye-Tracking Technology , Visual Perception , Humans , Attention/physiology , Infant , Male , Female , Child Development/physiology , Visual Perception/physiology , Eye Movements/physiology , Child, Preschool , Head Movements/physiology , Head/physiology
9.
J Neurotrauma ; 2024 Jul 12.
Article in English | MEDLINE | ID: mdl-38994598

ABSTRACT

Cholinergic disruptions underlie attentional deficits following traumatic brain injury (TBI). Yet, drugs specifically targeting acetylcholinesterase (AChE) inhibition have yielded mixed outcomes. We therefore hypothesized that galantamine (GAL), a dual-action competitive AChE inhibitor and α7 nicotinic acetylcholine receptor (nAChR) positive allosteric modulator, provided chronically after injury, would attenuate TBI-induced deficits of sustained attention and enhance ACh efflux in the medial prefrontal cortex (mPFC), as assessed by in vivo microdialysis. In Experiment 1, adult male rats (n = 10-15/group) trained in the 3-choice serial reaction time (3-CSRT) test were randomly assigned to controlled cortical impact (CCI) or sham surgery and administered GAL (0.5, 2.0, or 5.0 mg/kg; i.p.) or saline vehicle (VEH; 1 mL/kg; i.p.) beginning 24 h post-surgery and once daily thereafter for 27 days. Measures of sustained attention and distractibility were assessed on post-operative days 21-25 in the 3-CSRT, after which cortical lesion volume and basal forebrain cholinergic cells were quantified on day 27. In Experiment 2, adult male rats (n = 3-4/group) received a CCI and, beginning 24 h later, were administered (i.p.) one of the three doses of GAL or VEH for 21 days to quantify the dose-dependent effect of GAL on in vivo ACh efflux in the mPFC. Two weeks after the CCI, a guide cannula was implanted in the right mPFC. On post-surgery day 21, baseline and post-injection dialysate samples were collected in a manner temporally matched to the behavioral cohort. ACh levels were analyzed using reverse-phase high-performance liquid chromatography (HPLC) coupled to an electrochemical detector. Cortical lesion volume was quantified on day 22. The data were subjected to ANOVA, with repeated measures where appropriate, followed by Newman-Keuls post hoc analyses. All TBI groups displayed impaired sustained attention versus the pooled SHAM controls (p's < 0.05). Moreover, the highest dose of GAL (5.0 mg/kg) exacerbated attentional deficits relative to VEH and the two lower doses of GAL (p's < 0.05). TBI significantly reduced cholinergic cells in the right basal forebrain, regardless of treatment condition, versus SHAM (p < 0.05). In vivo microdialysis revealed no differences in basal ACh in the mPFC; however, GAL (5.0 mg/kg) significantly increased ACh efflux 30 min after injection compared with the VEH and the other GAL (0.5 and 2.0 mg/kg) groups (p's < 0.05). In both experiments, there were no differences in cortical lesion volume across treatment groups (p's > 0.05). In summary, although the highest dose of GAL increased ACh release, it did not improve measures of sustained attention or histopathological markers, thereby partially supporting the hypothesis and providing the impetus for further investigation of alternative cholinergic pharmacotherapies such as nAChR positive allosteric modulators.

10.
Exp Brain Res ; 242(8): 2033-2040, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38958722

ABSTRACT

Researchers dispute the cause of errors in high-Go, low-No-Go target detection tasks such as the Sustained Attention to Response Task (SART). Some researchers propose that errors in the SART are due to perceptual decoupling, in which a participant is unaware of stimulus identity; this lack of external awareness causes an erroneous response. Other researchers suggest that the majority of errors in the SART are instead due to response leniency, not perceptual decoupling. Response delays may enable a participant who is initially unaware of stimulus identity (perceptually decoupled) to become aware of it (perceptually recoupled). If, however, the stimulus presentation time is shortened to the minimum necessary for stimulus recognition and the stimulus is disrupted with a structured mask, then there should be no time for perception to recouple, even with a response delay. From the perceptual decoupling perspective, a response delay should have no impact on performance in this case. Alternatively, if response bias is critical, then even in this case a response delay may affect performance. In this study, we shortened stimulus presentation time and added a structured mask. We examined whether a response delay affected performance in the SART and in tasks where the SART's response format was reversed. We expected a response delay to affect signal detection theory bias, c, only in the SART, where response leniency is an issue. In the reverse-formatted SART, since bias was not expected to be lenient, we expected no or minimal impact of a response delay on response bias. These predictions were verified. Response bias is more critical to understanding SART performance than perceptual decoupling, which is rare in the SART if it occurs at all.
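The signal detection theory quantities referenced here (bias c and, implicitly, sensitivity d') can be computed from hit and false-alarm counts as in the minimal sketch below. The log-linear correction and the example trial counts are assumptions for illustration, not the authors' analysis, and the mapping of SART responses onto hits and false alarms follows whichever convention the analyst adopts.

```python
# Hedged sketch: signal detection theory sensitivity (d') and bias (c) from
# raw counts, with a log-linear correction to avoid infinite z-scores.
from scipy.stats import norm

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z_h, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_h - z_fa
    c = -(z_h + z_fa) / 2.0        # negative c indicates a lenient response bias
    return d_prime, c

# Hypothetical counts from one session
d, c = sdt_measures(hits=20, misses=5, false_alarms=60, correct_rejections=140)
print(f"d' = {d:.2f}, c = {c:.2f}")
```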


Subject(s)
Attention , Psychomotor Performance , Reaction Time , Humans , Attention/physiology , Female , Male , Young Adult , Adult , Reaction Time/physiology , Psychomotor Performance/physiology , Visual Perception/physiology , Adolescent , Photic Stimulation/methods
11.
Neuropharmacology ; 258: 110064, 2024 Nov 01.
Article in English | MEDLINE | ID: mdl-38981578

ABSTRACT

Nonmedical use of prescription opioids peaks during late adolescence, a developmental period associated with the maturation of higher-order cognitive processes. To date, however, how chronic adolescent oxycodone (OXY) self-administration alters neurobehavioral (i.e., locomotion, startle reactivity) and/or neurocognitive (i.e., preattentive processes, intrasession habituation, stimulus-reinforcement learning, sustained attention) function has not been systematically evaluated. Hence, the present study established the dose dependency of the effects of adolescent OXY self-administration on the trajectory of neurobehavioral and neurocognitive development. From postnatal day (PD) 35 to PD 105, an age range in rats that corresponds to the adolescent and young adult period in humans, male and female F344/N rats received access to either oral OXY (0, 2, 5, or 10 mg/kg) or water under a two-bottle choice experimental paradigm. Independent of biological sex or dose, rodents voluntarily escalated their OXY intake across ten weeks. A longitudinal experimental design revealed prominent OXY-induced impairments in neurobehavioral development, characterized by dose-dependent increases in locomotion and sex-dependent increases in startle reactivity. Systematic manipulation of the interstimulus interval in prepulse inhibition supports an OXY-induced impairment in preattentive processes. Despite long-term cessation of OXY intake, rodents with a history of chronic adolescent oral OXY self-administration exhibited deficits in sustained attention, albeit with no alterations in stimulus-reinforcement learning. Taken together, adolescent oral OXY self-administration induces selective long-term alterations in neurobehavioral and neurocognitive development, underscoring the need for safer prescribing guidelines for this population.
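As background for the prepulse inhibition measure manipulated above, the following sketch shows the conventional %PPI computation. The startle amplitudes and interstimulus intervals are invented for illustration and are not the study's data.

```python
# Hedged sketch of the conventional percent prepulse inhibition (%PPI) computation.
import numpy as np

def percent_ppi(startle_pulse_alone, startle_prepulse_pulse):
    """%PPI = 100 * (1 - mean(prepulse+pulse startle) / mean(pulse-alone startle))."""
    return 100.0 * (1.0 - np.mean(startle_prepulse_pulse) / np.mean(startle_pulse_alone))

pulse_alone = [850, 910, 880, 905]                 # arbitrary startle amplitudes (a.u.)
prepulse_by_isi_ms = {30: [430, 460, 450], 60: [380, 400, 395], 120: [520, 540, 530]}
for isi_ms, amplitudes in prepulse_by_isi_ms.items():
    print(f"ISI {isi_ms:3d} ms: %PPI = {percent_ppi(pulse_alone, amplitudes):.1f}")
```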


Subject(s)
Analgesics, Opioid , Oxycodone , Reflex, Startle , Self Administration , Animals , Oxycodone/administration & dosage , Oxycodone/adverse effects , Male , Female , Rats , Administration, Oral , Analgesics, Opioid/administration & dosage , Analgesics, Opioid/adverse effects , Reflex, Startle/drug effects , Dose-Response Relationship, Drug , Cognition/drug effects , Prepulse Inhibition/drug effects , Locomotion/drug effects , Attention/drug effects
12.
Sci Rep ; 14(1): 17001, 2024 07 24.
Article in English | MEDLINE | ID: mdl-39043835

ABSTRACT

The Continuous Visual Attention Test (CVAT) measures visuomotor reaction time (RT; alertness), variability of reaction time (VRT; sustained attention), omission errors (OE; focused attention), and commission errors (CE; response inhibition). The standard test takes 15 min, while the ultrafast version takes only 90 s. Besides overall task length, the two versions differ in target probability (20% and 80% in the 15-min test vs. only 80% in the 90-s test) and stimulus-onset asynchrony (SOA) (1, 2, and 4 s in the 15-min test vs. only 1 s in the 90-s test). We aimed to analyze the effects of target probability, SOA, and task length on the CVAT variables across the 15-min task, and to verify correlations and agreement between the 15-min and 90-s CVATs. A total of 205 healthy participants performed the two CVATs on the same day. In the 15-min task, RT and CE were strongly affected by target probability, whereas VRT was not. When the 15-min task was compared with the 90-s task, we found no significant difference in the VRT variable, and a significant agreement between the two tasks was found for VRT. We concluded that sustained attention can be measured with the 90-s CVAT.
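The sketch below illustrates one common way to examine both correlation and agreement between a long and a short form of a test (Pearson r plus Bland-Altman bias and limits of agreement). The abstract does not specify which agreement statistic the authors used, and the simulated VRT scores are invented, so treat this purely as an illustration.

```python
# Hedged sketch: correlation and Bland-Altman agreement between two versions
# of a test, using simulated VRT (RT standard deviation) scores.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(4)
vrt_15min = rng.normal(120, 25, 205)             # simulated VRT (ms), 205 participants
vrt_90s = vrt_15min + rng.normal(0, 12, 205)     # simulated short-form scores

r, p = pearsonr(vrt_15min, vrt_90s)
diff = vrt_90s - vrt_15min
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)                    # 95% limits of agreement
print(f"r = {r:.2f} (p = {p:.3g}); bias = {bias:.1f} ms, limits of agreement ±{loa:.1f} ms")
```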


Subject(s)
Attention , Reaction Time , Humans , Attention/physiology , Reaction Time/physiology , Male , Female , Adult , Young Adult , Psychomotor Performance/physiology , Neuropsychological Tests , Adolescent , Middle Aged , Visual Perception/physiology , Photic Stimulation
13.
Cereb Cortex ; 34(7)2024 Jul 03.
Article in English | MEDLINE | ID: mdl-39076112

ABSTRACT

Sustained attention, as the basis of general cognitive ability, naturally varies across time scales, spanning from hours (e.g., from wakefulness to drowsiness) to seconds (e.g., trial-by-trial fluctuation within a task session). Whether a unified mechanism underlies such trans-scale variability remains unclear. Here we show that fluctuation of cortical excitation/inhibition (E/I) is a strong modulator of sustained attention in humans across time scales. First, we observed that the ability to attend varied across different brain states (wakefulness, postprandial somnolence, sleep deprivation), as well as within any single state, with larger swings. Second, regardless of the time scale involved, we found that a highly attentive state was always linked to more balanced cortical E/I, characterized by electroencephalography (EEG) features, while deviations from the balanced state led to temporal declines in attention, suggesting fluctuation of cortical E/I as a common mechanism underlying trans-scale attentional variability. Furthermore, the variations of both sustained attention and cortical E/I indices showed fractal structure in the temporal domain, with features of self-similarity. Taken together, these results demonstrate that sustained attention naturally varies across time scales in a more complex way than previously appreciated, with cortical E/I as a shared neurophysiological modulator.
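The abstract does not state which EEG features were used to index cortical E/I; one commonly used proxy in the literature is the aperiodic (1/f) slope of the power spectrum, sketched below on surrogate data. This is an assumption offered only as an illustration of the kind of EEG-derived E/I index involved, not the authors' method.

```python
# Hedged sketch: aperiodic (1/f) spectral slope as one possible EEG-derived E/I proxy.
import numpy as np
from scipy.signal import welch

def aperiodic_slope(eeg, fs, fmin=3.0, fmax=45.0):
    """Slope of log10(power) vs. log10(frequency) over a broadband range;
    flatter (less negative) slopes are often read as a shift toward excitation."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))
    band = (freqs >= fmin) & (freqs <= fmax)
    slope, _ = np.polyfit(np.log10(freqs[band]), np.log10(psd[band]), deg=1)
    return slope

rng = np.random.default_rng(5)
fs = 250
eeg = rng.normal(size=fs * 60)          # 60 s of surrogate (white-noise) signal
print(f"aperiodic slope: {aperiodic_slope(eeg, fs):.2f}")
```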


Subject(s)
Attention , Cerebral Cortex , Electroencephalography , Wakefulness , Humans , Attention/physiology , Male , Female , Young Adult , Adult , Wakefulness/physiology , Cerebral Cortex/physiology , Neural Inhibition/physiology , Time Factors , Cortical Excitability/physiology , Sleep Deprivation/physiopathology
14.
Sci Rep ; 14(1): 17455, 2024 07 29.
Article in English | MEDLINE | ID: mdl-39075100

ABSTRACT

The primary therapeutic goal of neurooncological surgeons treating prefrontal gliomas is to attempt supramarginal tumor resection while preserving relevant neurological function. Advanced knowledge of the functional neuroanatomy of the frontal aslant tract (FAT) in higher-order cognitive domains beyond language and speech processing would therefore help refine such surgeries, predicting possible cognitive adverse events and maximizing surgical efficacy. To this end, we performed recently developed correlational tractography analyses to evaluate the relationship between the FAT's microstructural properties and cognitive functions in 27 healthy subjects with ultra-high-field (7 Tesla) diffusion MRI. We independently assessed FAT segments innervating the dorsolateral prefrontal cortices (dlPFC-FAT) and the supplementary motor area (SMA-FAT). FAT microstructural robustness, measured by the tract's quantitative anisotropy (QA), was associated with better performance in episodic memory, visuospatial orientation, cognitive processing speed, and fluid intelligence, but not in sustained selective attention tests. Overall, the percentage of tract volume showing an association between the QA index and improved cognitive scores (pQACV) was higher in the SMA-FAT than in the dlPFC-FAT segment. This effect was right-lateralized for verbal episodic memory and fluid intelligence and bilateral for visuospatial orientation and cognitive processing speed. Our results provide novel evidence for a functional specialization of the FAT beyond its known role in language and speech processing, particularly its involvement in several higher-order cognitive domains. In light of these findings, further research should focus on neurocognitive deficits and their impact on patient outcomes after FAT damage, especially in the context of glioma surgery.


Subject(s)
Cognition , Diffusion Tensor Imaging , Humans , Male , Female , Cognition/physiology , Adult , Diffusion Tensor Imaging/methods , Diffusion Magnetic Resonance Imaging/methods , Middle Aged , Young Adult , Glioma/diagnostic imaging , Glioma/surgery , Glioma/pathology , Dorsolateral Prefrontal Cortex/diagnostic imaging
15.
Sleep Adv ; 5(1): zpae043, 2024.
Article in English | MEDLINE | ID: mdl-39036743

ABSTRACT

People with narcolepsy type 1 (NT1), narcolepsy type 2 (NT2), and idiopathic hypersomnia (IH) often report cognitive impairment, which can be quite burdensome but is rarely evaluated in routine clinical practice. In this systematic review and meta-analysis, we assessed the nature and magnitude of cognitive impairment in NT1, NT2, and IH in studies conducted from January 2000 to October 2022. We classified cognitive tests assessing memory, executive function, and attention by cognitive domain. Between-group differences were analyzed as standardized mean differences (Cohen's d), and Cohen's d values for individual tests were integrated by cognitive domain and clinical disease group. Eighty-seven studies were screened for inclusion; 39 satisfied the inclusion criteria, yielding 73 comparisons (k): NT1, k = 60; NT2, k = 8; IH, k = 5. Attention showed large impairment in people with NT1 (d = -0.90) and IH (d = -0.97), and moderate impairment in NT2 (d = -0.60). Executive function was moderately impaired in NT1 (d = -0.30) and NT2 (d = -0.38), and memory showed small impairments in NT1 (d = -0.33). A secondary meta-analysis identified sustained attention as the most impaired domain in NT1, NT2, and IH (d ≈ -0.5 to -1). These meta-analyses confirm that cognitive impairments are present in NT1, NT2, and IH, and provide quantitative confirmation of the cognitive difficulties reported by patients and clinicians. These findings provide a basis for the future design of studies to determine whether cognitive impairments improve with pharmacologic and nonpharmacologic treatments for narcolepsy and IH.
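The effect-size workflow described above (per-test Cohen's d computed from group means and SDs, then pooled within a cognitive domain) can be sketched as follows. The test names, means, SDs, and sample sizes are invented for illustration and are not the values reported in the meta-analysis.

```python
# Hedged sketch: Cohen's d per test from summary statistics, averaged by domain.
import numpy as np

def cohens_d(mean_patients, sd_patients, n_patients, mean_controls, sd_controls, n_controls):
    pooled_sd = np.sqrt(((n_patients - 1) * sd_patients**2 + (n_controls - 1) * sd_controls**2)
                        / (n_patients + n_controls - 2))
    return (mean_patients - mean_controls) / pooled_sd

# Hypothetical attention-test comparisons for one patient group (scores where lower = worse)
attention_tests = [
    cohens_d(0.82, 0.10, 40, 0.91, 0.07, 38),   # hypothetical accuracy measure
    cohens_d(24.0, 6.0, 40, 29.5, 5.5, 38),     # hypothetical correct-detections measure
]
print("per-test d:", [round(d, 2) for d in attention_tests])
print("domain-level d (mean):", round(float(np.mean(attention_tests)), 2))
```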

16.
Exp Brain Res ; 242(7): 1787-1795, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38822826

ABSTRACT

The vigilance decrement, a temporal decline in detection performance, has been observed across multiple sensory modalities. Spatial uncertainty about the location of task-relevant stimuli has been shown to increase the demands of vigilance and the severity of the vigilance decrement when attending to visual displays. The current study investigated whether spatial uncertainty also increases the severity of the vigilance decrement and task demands when an auditory display is used. Participants monitored an auditory display to detect critical signals that were shorter in duration than non-target stimuli. These auditory stimuli were presented either in a consistent, predictable pattern that alternated sound presentation from left to right (spatial certainty) or in an inconsistent, unpredictable pattern that randomly presented sounds from the left or right (spatial uncertainty). Cerebral blood flow velocity (CBFV) was measured to assess the neurophysiological demands of the task. A decline in performance and CBFV was observed in both the spatially certain and spatially uncertain conditions, suggesting that spatial auditory vigilance tasks are demanding and can produce a vigilance decrement. Spatial uncertainty resulted in a more severe vigilance decrement in correct detections compared with spatial certainty, and reduced right-hemispheric CBFV was also observed during spatial uncertainty compared with spatial certainty. Together, these results suggest that auditory spatial uncertainty hindered performance and required greater attentional resources than spatial certainty. These results concur with previous research showing the negative impact of spatial uncertainty in visual vigilance tasks, but contrast with recent research showing no effect of spatial uncertainty on tactile vigilance.


Subject(s)
Auditory Perception , Cerebrovascular Circulation , Space Perception , Humans , Male , Female , Young Adult , Uncertainty , Adult , Auditory Perception/physiology , Cerebrovascular Circulation/physiology , Space Perception/physiology , Acoustic Stimulation/methods , Hemodynamics/physiology , Attention/physiology , Arousal/physiology , Psychomotor Performance/physiology
17.
Front Psychol ; 15: 1375717, 2024.
Article in English | MEDLINE | ID: mdl-38708020

ABSTRACT

Excessive mind wandering (MW) contributes to the development and maintenance of psychiatric disorders. Previous studies have suggested that auditory beat stimulation may help reduce MW. However, little is known about how different auditory stimulation conditions are subjectively perceived and whether this perception is in turn related to changes in subjective states, behavioral measures of attention, and MW. In the present study, we therefore investigated MW under auditory beat stimulation and control conditions using experience sampling during a sustained attention to response task (SART). The subjective perception of the stimulation conditions, as well as changes in anxiety, stress, and negative mood after versus before stimulation, were assessed via visual-analog scales. Results showed that any auditory stimulation applied during the SART was perceived as more distracting, disturbing, uncomfortable, and tiring than silence and was related to more pronounced increases in stress and negative mood. Importantly, the perception of the auditory conditions as disturbing was directly correlated with MW propensity. Additionally, distracting, disturbing, and uncomfortable perceptions predicted negative mood. In turn, negative mood was inversely correlated with response accuracy for target stimuli, a behavioral indicator of MW. In summary, our data show that MW and attentional performance are affected by the adverse perception of auditory stimulation, and that this influence may be mediated by changes in mood.

18.
Cogn Emot ; : 1-13, 2024 May 07.
Article in English | MEDLINE | ID: mdl-38712807

ABSTRACT

Sustained attention, a key cognitive skill that improves during childhood and adolescence, tends to be worse in some emotional and behavioural disorders. Sustained attention is typically studied in non-affective task contexts; here, we used a novel task to index performance in affective versus neutral contexts across adolescence (N = 465; ages 11-18). We asked whether: (i) performance would be worse in negative versus neutral task contexts; (ii) performance would improve with age; (iii) affective interference would be greater in younger adolescents; (iv) adolescents at risk for depression and those higher in anxiety would show overall worse performance; and (v) these adolescents would show differential performance in negative contexts. Results indicated that participants performed more poorly in negative contexts and showed age-related performance improvements. Those at risk of depression performed more poorly than those at lower risk; however, there was no difference between groups as a result of affective context. For anxiety, there was no difference in performance as a function of severity, but those with higher anxiety showed less variance in their reaction times to negative stimuli than those with lower anxiety. One interpretation is that the moderate levels of emotional arousal associated with anxiety make individuals less susceptible to the distracting effects of negative stimuli.

19.
J Autism Dev Disord ; 2024 May 28.
Article in English | MEDLINE | ID: mdl-38806749

ABSTRACT

In this study, we aimed to assess the influence of school-based neurofeedback training on the attention of students with autism and intellectual disabilities. We assessed 24 students of a special education center who attended neurofeedback training sessions during the school year, as well as 25 controls from the same center. We used two computer tasks to assess sustained attention in simple and cognitively demanding test situations, and a pen-and-paper task to assess selective attention. Each student who took part in the study was tested at the beginning and at the end of the school year. Students from the experimental group significantly improved their performance in the task assessing sustained attention to simple stimuli. No performance improvement related to the neurofeedback treatment was observed for either sustained attention in cognitively demanding situations or selective attention. School-based neurofeedback training may thus improve sustained attention to simple stimuli in students with developmental disabilities.

20.
Bioengineering (Basel) ; 11(5)2024 May 08.
Article in English | MEDLINE | ID: mdl-38790334

ABSTRACT

Sustained attention is pivotal for tasks such as studying and working, in which focus and minimal distraction are necessary for peak productivity. This study explores the effectiveness of adaptive transcranial direct current stimulation (tDCS) over either the frontal or the parietal region in enhancing sustained attention. Ten healthy university students performed the Continuous Performance Task-AX (AX-CPT) while receiving either frontal or parietal tDCS. The study comprised three phases. First, we acquired the electroencephalography (EEG) signal to identify the metrics most closely related to attention states; among the spectral and complexity metrics computed on 3 s epochs of EEG, the Fuzzy Entropy and Multiscale Sample Entropy Index of frontal channels were selected. Second, we assessed how tDCS at a fixed 1.0 mA current affects attentional performance. Finally, a real-time experiment with continuous metric monitoring allowed personalized, dynamic optimization of the current amplitude and stimulation site (frontal or parietal). The findings reveal statistically significant improvements in mean accuracy (94.04 vs. 90.82%) and reaction times (262.93 vs. 302.03 ms) with adaptive tDCS compared to a non-stimulation condition. Average reaction times were statistically shorter during adaptive stimulation than under a fixed current amplitude (262.93 vs. 283.56 ms), while mean accuracy remained similar (94.04 vs. 93.36%; improvement not statistically significant). Despite the limited number of subjects, this work highlights the promising potential of adaptive tDCS as a tailored treatment for enhancing sustained attention.
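One of the complexity metrics named above, sample entropy, can be computed on a 3 s EEG epoch as in the rough sketch below. The embedding dimension, tolerance, and surrogate data are assumptions, and the fuzzy and multiscale variants actually used in the study are not reproduced here.

```python
# Hedged sketch: sample entropy of a 3 s epoch, a basic complexity metric
# related to (but simpler than) the fuzzy/multiscale indices named in the abstract.
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """SampEn(m, r) with Chebyshev distance; tolerance r = r_factor * std(x)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    n = len(x)

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    b = count_matches(m)        # template matches of length m
    a = count_matches(m + 1)    # template matches of length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(6)
fs = 250
epoch = rng.normal(size=3 * fs)                 # one 3 s surrogate epoch
print(f"sample entropy of 3 s epoch: {sample_entropy(epoch):.2f}")
```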
