Results 1 - 20 of 22
1.
Eur J Neurosci ; 59(9): 2373-2390, 2024 May.
Article in English | MEDLINE | ID: mdl-38303554

ABSTRACT

Humans have the remarkable ability to integrate information from different senses, which greatly facilitates the detection, localization and identification of events in the environment. About 466 million people worldwide suffer from hearing loss. Yet, the impact of hearing loss on how the senses work together is rarely investigated. Here, we investigate how a common sensory impairment, asymmetric conductive hearing loss (AHL), alters the way our senses interact by examining human orienting behaviour with normal hearing (NH) and acute AHL. This type of hearing loss disrupts auditory localization. We hypothesized that this creates a conflict between auditory and visual spatial estimates and alters how auditory and visual inputs are integrated to facilitate multisensory spatial perception. We analysed the spatial and temporal properties of saccades to auditory, visual and audiovisual stimuli before and after plugging the right ear of participants. Both spatial and temporal aspects of multisensory integration were affected by AHL. Compared with NH, AHL caused participants to make slow, inaccurate and imprecise saccades towards auditory targets. Surprisingly, increased weight on visual input resulted in accurate audiovisual localization with AHL. This came at a cost: saccade latencies for audiovisual targets increased significantly. The larger the auditory localization errors, the less participants were able to benefit from audiovisual integration in terms of saccade latency. Our results indicate that observers immediately change sensory weights to effectively deal with acute AHL and preserve audiovisual accuracy in a way that cannot be fully explained by statistical models of optimal cue integration.
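
A minimal sketch of the kind of statistical model referred to here: reliability-weighted (maximum-likelihood) cue integration, in which each cue is weighted by its inverse variance. The numbers are hypothetical and this is not the authors' analysis code; it only illustrates why a noisier auditory estimate shifts the weight towards vision.

    def mle_integrate(x_a, var_a, x_v, var_v):
        """Reliability-weighted average of auditory and visual location estimates (deg)."""
        w_a = (1 / var_a) / (1 / var_a + 1 / var_v)   # auditory weight
        x_av = w_a * x_a + (1 - w_a) * x_v            # combined location estimate
        var_av = 1 / (1 / var_a + 1 / var_v)          # combined variance (never exceeds either cue's)
        return x_av, var_av, w_a

    # Hypothetical example: plugging one ear inflates the auditory variance.
    x_av, var_av, w_a = mle_integrate(x_a=15.0, var_a=100.0, x_v=10.0, var_v=4.0)
    print(f"combined estimate = {x_av:.1f} deg, auditory weight = {w_a:.2f}")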


Subject(s)
Sound Localization , Visual Perception , Humans , Female , Adult , Male , Visual Perception/physiology , Sound Localization/physiology , Young Adult , Saccades/physiology , Auditory Perception/physiology , Hearing Loss/physiopathology , Photic Stimulation/methods , Acoustic Stimulation/methods , Space Perception/physiology
2.
Neuroimage ; 286: 120515, 2024 Feb 01.
Article in English | MEDLINE | ID: mdl-38216105

ABSTRACT

Many sensory brain areas are organized as topographic maps where neural response preferences change gradually across the cortical surface. Within association cortices, 7-Tesla fMRI and neural model-based analyses have also revealed many topographic maps for quantities like numerosity and event timing, often in similar locations. Numerical and temporal quantity estimations also show behavioral similarities and even interactions. For example, the duration of high-numerosity displays is perceived as longer than that of low-numerosity displays. Such interactions are often ascribed to a generalized magnitude system with shared neural responses across quantities. Anterior quantity responses are more closely linked to behavior. Here, we investigate whether common quantity representations hierarchically emerge by asking whether numerosity and timing maps become increasingly closely related in their overlap, response preferences, and topography. While the earliest quantity maps do not overlap, more superior maps overlap increasingly. In these overlapping areas, some intraparietal maps have consistently correlated numerosity and timing preferences, and some maps have consistent angles between the topographic progressions of numerosity and timing preferences. However, neither of these relationships increases hierarchically like the amount of overlap does. Therefore, responses to different quantities are initially derived separately, then progressively brought together, without generally becoming a common representation. Bringing together distinct responses to different quantities may underlie behavioral interactions and allow shared access to comparison and action planning systems.


Subject(s)
Brain Mapping , Brain , Humans , Photic Stimulation , Magnetic Resonance Imaging , Cerebral Cortex
3.
Nat Commun ; 13(1): 3952, 2022 07 08.
Article in English | MEDLINE | ID: mdl-35804026

ABSTRACT

Quantifying the timing (duration and frequency) of brief visual events is vital to human perception, multisensory integration and action planning. Tuned neural responses to visual event timing have been found in association cortices, in areas implicated in these processes. Here we ask how these timing-tuned responses are related to the responses of early visual cortex, which monotonically increase with event duration and frequency. Using 7-Tesla functional magnetic resonance imaging and neural model-based analyses, we find a gradual transition from monotonically increasing to timing-tuned neural responses beginning in the medial temporal area (MT/V5). Therefore, across successive stages of visual processing, timing-tuned response components gradually become dominant over inherent sensory response modulation by event timing. This additional timing-tuned response component is independent of retinotopic location. We propose that this hierarchical emergence of timing-tuned responses from sensory processing areas quantifies sensory event timing while abstracting temporal representations from spatial properties of their inputs.
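
As a rough illustration of the two response-model families contrasted in this work, the sketch below implements a monotonic model whose amplitude simply grows with event duration and frequency, and a timing-tuned model that peaks at a preferred duration and frequency. The functional forms and parameters are assumptions chosen for illustration, not the authors' fitted neural models.

    import numpy as np

    def monotonic_response(duration_s, frequency_hz, a=1.0, b=0.2):
        """Early-visual-style component: response grows with event duration and frequency."""
        return a * duration_s + b * frequency_hz

    def tuned_response(duration_s, frequency_hz, pref_dur=0.4, pref_freq=2.0, sigma=0.5):
        """Association-cortex-style component: Gaussian tuning around a preferred timing
        (log-spaced here; the exact parameterisation is an assumption)."""
        d2 = np.log(duration_s / pref_dur) ** 2 + np.log(frequency_hz / pref_freq) ** 2
        return np.exp(-d2 / (2 * sigma ** 2))

    durations = np.array([0.1, 0.2, 0.4, 0.8])    # seconds
    frequencies = np.array([1.0, 2.0, 4.0, 8.0])  # Hz
    print(monotonic_response(durations, frequencies))  # keeps rising with timing
    print(tuned_response(durations, frequencies))      # peaks near the preferred timing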


Subject(s)
Visual Cortex , Visual Perception , Brain Mapping , Cerebral Cortex/physiology , Humans , Magnetic Resonance Imaging , Photic Stimulation , Sensation , Visual Cortex/diagnostic imaging , Visual Cortex/physiology , Visual Perception/physiology
4.
Neuroimage ; 258: 119366, 2022 09.
Article in English | MEDLINE | ID: mdl-35690255

ABSTRACT

Perception of sub-second auditory event timing supports multisensory integration, and speech and music perception and production. Neural populations tuned for the timing (duration and rate) of visual events were recently described in several human extrastriate visual areas. Here we ask whether the brain also contains neural populations tuned for auditory event timing, and whether these are shared with visual timing. Using 7T fMRI, we measured responses to white noise bursts of changing duration and rate. We analyzed these responses using neural response models describing different parametric relationships between event timing and neural response amplitude. This revealed auditory timing-tuned responses in the primary auditory cortex, and auditory association areas of the belt, parabelt and premotor cortex. While these areas also showed tonotopic tuning for auditory pitch, pitch and timing preferences were not consistently correlated. Auditory timing-tuned response functions differed between these areas, though without clear hierarchical integration of responses. The similarity of auditory and visual timing tuned responses, together with the lack of overlap between the areas showing these responses for each modality, suggests modality-specific responses to event timing are computed similarly but from different sensory inputs, and then transformed differently to suit the needs of each modality.
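
One of the analyses described here asks whether, within a map, the preferred pitch of a voxel predicts its preferred event timing. A minimal sketch of such a test, run on simulated preferences rather than the study's data:

    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)
    # Hypothetical per-voxel preferences within one auditory map.
    pref_pitch_hz = rng.uniform(200, 4000, size=120)    # tonotopic preference
    pref_duration_s = rng.uniform(0.05, 1.0, size=120)  # timing preference

    rho, p = spearmanr(pref_pitch_hz, pref_duration_s)
    print(f"pitch-timing preference correlation: rho = {rho:.2f}, p = {p:.3f}")
    # A consistently non-zero rho across maps would indicate linked pitch and timing tuning.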


Subject(s)
Auditory Cortex , Music , Acoustic Stimulation , Auditory Cortex/physiology , Auditory Perception/physiology , Brain Mapping , Humans , Magnetic Resonance Imaging
5.
Hear Res ; 417: 108468, 2022 04.
Article in English | MEDLINE | ID: mdl-35220107

ABSTRACT

The distance of sound sources relative to the body can be estimated using acoustic level and direct-to-reverberant ratio cues. However, the ability to do this may differ for sounds that are in front compared to behind the listener. One reason for this is that vision, which plays an important role in calibrating auditory distance cues early in life, is unavailable for rear space. Furthermore, the filtering of sounds by the pinnae differs if they originate from the front compared to the back. We investigated auditory distance discrimination in front and rear space by comparing performance for auditory spatial bisection of distance and minimum audible distance discrimination (MADD) tasks. In the bisection task, participants heard three successive bursts of noise at three different distances and indicated whether the second sound (probe) was closer in space to the first or third sound (references). In the MADD task, participants reported which of two successive sounds was closer. An analysis of variance with factors task and region of space showed worse performance for rear than for front space, but no significant interaction between task and region of space. For the bisection task, the point of subjective equality (PSE) was slightly biased towards the body, but the absolute magnitude of the PSE did not differ between front and rear space. These results are consistent with the hypothesis that visual information is important in calibrating the auditory representation of front space in distance early in life.
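
A compact sketch of how a point of subjective equality (PSE) can be extracted from bisection responses by fitting a cumulative Gaussian. The data and the sign convention (negative = nearer the observer) are hypothetical, and the study's actual fitting procedure may differ.

    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    def psychometric(probe_pos, pse, sigma):
        """P('probe closer to the far reference') as a cumulative Gaussian over probe position."""
        return norm.cdf(probe_pos, loc=pse, scale=sigma)

    # Hypothetical probe positions (cm, relative to the midpoint of the two references;
    # negative = nearer the observer) and response proportions.
    probe_pos = np.array([-40.0, -20.0, -10.0, 0.0, 10.0, 20.0, 40.0])
    p_far = np.array([0.05, 0.18, 0.35, 0.55, 0.74, 0.90, 0.98])

    (pse, sigma), _ = curve_fit(psychometric, probe_pos, p_far, p0=[0.0, 15.0])
    print(f"PSE = {pse:.1f} cm (negative = bias towards the body), sigma = {sigma:.1f} cm")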


Subject(s)
Distance Perception , Sound Localization , Acoustic Stimulation , Auditory Perception , Cues , Humans , Sound , Space Perception
6.
Sci Rep ; 11(1): 707, 2021 01 12.
Article in English | MEDLINE | ID: mdl-33436889

ABSTRACT

Pupillometry has received increased interest for its usefulness in measuring various sensory processes as an alternative to behavioural assessments. This is also apparent for multisensory investigations. Studies of the multisensory pupil response, however, have produced conflicting results. Some studies observed super-additive multisensory pupil responses, indicative of multisensory integration (MSI). Others observed additive multisensory pupil responses even though reaction time (RT) measures were indicative of MSI. Therefore, in the present study, we investigated the nature of the multisensory pupil response by combining methodological approaches of previous studies while using supra-threshold stimuli only. In two experiments we presented auditory and visual stimuli to observers that evoked an onset response (be it constriction or dilation) in a simple detection task and a change detection task. In both experiments, the RT data indicated MSI as shown by race model inequality violation. Still, the multisensory pupil response in both experiments could best be explained by linear summation of the unisensory pupil responses. We conclude that the multisensory pupil response for supra-threshold stimuli is additive in nature and cannot be used as a measure of MSI, as only a departure from additivity can unequivocally demonstrate an interaction between the senses.
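
The additivity test at the heart of this study compares the measured multisensory pupil trace with the sum of the unisensory traces; only a reliable excess over that sum (super-additivity) would signal integration. A toy version with hypothetical, baseline-corrected traces, purely illustrative of the comparison rather than the study's pipeline:

    import numpy as np

    t = np.linspace(0.0, 2.0, 201)                      # seconds after stimulus onset
    pupil_a = 0.10 * np.exp(-((t - 0.8) / 0.4) ** 2)    # auditory-only trace (a.u., hypothetical)
    pupil_v = 0.15 * np.exp(-((t - 0.9) / 0.4) ** 2)    # visual-only trace (hypothetical)
    pupil_av = pupil_a + pupil_v                        # constructed as exactly additive here;
                                                        # in real data this trace is measured on AV trials
    additive_prediction = pupil_a + pupil_v
    superadditivity = pupil_av - additive_prediction    # > 0 at some latency would suggest super-additive MSI
    print(f"max deviation from additivity: {superadditivity.max():.3f} a.u.")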


Subject(s)
Acoustic Stimulation/methods , Auditory Perception/physiology , Evoked Potentials , Photic Stimulation/methods , Pupil/physiology , Reaction Time/physiology , Visual Perception/physiology , Adult , Female , Humans , Male , Sensation , Young Adult
7.
J Vis ; 20(9): 8, 2020 09 02.
Article in English | MEDLINE | ID: mdl-32915955

ABSTRACT

Whenever we move our eyes, some visual information obtained before a saccade is combined with the visual information obtained after a saccade. Interestingly, saccades rarely land exactly on the saccade target, which may pose a problem for transsaccadic perception as it could affect the quality of postsaccadic input. Recently, however, we showed that transsaccadic feature integration is actually unaffected by deviations of saccade landing points. Possibly, transsaccadic integration remains unaffected because the presaccadic shift of attention follows the intended saccade target and not the actual saccade landing point during regular saccades. Here, we investigated whether saccade landing point errors can in fact alter transsaccadic perception when the presaccadic shift of attention follows the saccade landing point deviation. Given that saccadic adaptation not only changes the saccade vector, but also the presaccadic shift of attention, we combined a feature report paradigm with saccadic adaptation. Observers reported the color of the saccade target, which occasionally changed slightly during a saccade to the target. This task was performed before and after saccadic adaptation. The results showed that, after adaptation, presaccadic color information became less precise and transsaccadic perception had a stronger reliance on the postsaccadic color estimate. Therefore, although previous studies have shown that transsaccadic perception is generally unaffected by saccade landing point deviations, our results reveal that this cannot be considered a general property of the visual system. When presaccadic shifts of attention follow altered saccade landing points, transsaccadic perception is affected, suggesting that transsaccadic feature perception might be dependent on visual spatial attention.


Subject(s)
Adaptation, Physiological/physiology , Saccades/physiology , Spatial Processing/physiology , Adult , Color Perception , Female , Humans , Male , Photic Stimulation
8.
Curr Biol ; 30(13): R775-R778, 2020 07 06.
Article in English | MEDLINE | ID: mdl-32634421

ABSTRACT

The motion of a translating sound source is easily perceived, yet clear evidence of motion mechanisms in auditory cortex has proved elusive. A new study may explain why, revealing that auditory motion is encoded in a motion-specialised region of visual cortex.


Subject(s)
Auditory Cortex , Motion Perception , Acoustic Stimulation , Auditory Perception , Humans , Motion
9.
Eur J Neurosci ; 51(5): 1137-1150, 2020 03.
Article in English | MEDLINE | ID: mdl-28973789

ABSTRACT

To date, most of the research on spatial attention has focused on probing people's responses to stimuli presented in frontal space. That is, few researchers have attempted to assess what happens in the space that is currently unseen (essentially rear space). In a sense, then, 'out of sight' is, very much, 'out of mind'. In this review, we highlight what is presently known about the perception and processing of sensory stimuli (focusing on sounds) whose source is not currently visible. We briefly summarize known differences in the localizability of sounds presented from different locations in 3D space, and discuss the consequences for the crossmodal attentional and multisensory perceptual interactions taking place in various regions of space. The latest research now clearly shows that the kinds of crossmodal interactions that take place in rear space are very often different in kind from those that have been documented in frontal space. Developing a better understanding of how people respond to unseen sound sources in naturalistic environments by integrating findings emerging from multiple fields of research will likely lead to the design of better warning signals in the future. This review highlights the need for neuroscientists interested in spatial attention to spend more time researching what happens (in terms of the covert and overt crossmodal orienting of attention) in rear space.


Subject(s)
Sound , Space Perception , Auditory Perception , Humans
10.
J Cogn Neurosci ; 31(6): 885-899, 2019 06.
Article in English | MEDLINE | ID: mdl-30883294

ABSTRACT

The integration of information from multiple senses leads to a plethora of behavioral benefits, most predominantly to faster and better detection, localization, and identification of events in the environment. Although previous studies of multisensory integration (MSI) in humans have provided insights into the neural underpinnings of MSI, studies of MSI at a behavioral level in individuals with brain damage are scarce. Here, a well-known psychophysical paradigm (the redundant target paradigm) was employed to quantify MSI in a group of stroke patients. The relation between MSI and lesion location was analyzed using lesion subtraction analysis. Twenty-one patients with ischemic infarctions and 14 healthy control participants responded to auditory, visual, and audiovisual targets in the left and right visual hemifield. Responses to audiovisual targets were faster than to unisensory targets. This could be due to MSI or statistical facilitation. Comparing the audiovisual RTs to the winner of a race between unisensory signals allowed us to determine whether participants could integrate auditory and visual information. The results indicated that (1) 33% of the patients showed an impairment in MSI; (2) patients with MSI impairment had left hemisphere and brainstem/cerebellar lesions; and (3) the left caudate, left pallidum, left putamen, left thalamus, left insula, left postcentral and precentral gyrus, left central opercular cortex, left amygdala, and left OFC were more often damaged in patients with MSI impairments. These results are the first to demonstrate the impact of brain damage on MSI in stroke patients using a well-established psychophysical paradigm.
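
The race-model comparison mentioned here is commonly implemented as Miller's inequality: if the senses race independently, the audiovisual RT distribution cannot exceed the sum of the two unisensory distributions. A small illustrative implementation with hypothetical RTs, not patient data:

    import numpy as np

    def cdf_at(rts, t):
        """Empirical cumulative RT distribution evaluated at times t."""
        rts = np.sort(np.asarray(rts))
        return np.searchsorted(rts, t, side="right") / len(rts)

    # Hypothetical RTs (ms) for one participant.
    rt_a  = np.array([410, 395, 430, 450, 402, 388, 470, 415])
    rt_v  = np.array([380, 362, 401, 390, 377, 410, 366, 395])
    rt_av = np.array([310, 298, 330, 305, 340, 300, 322, 315])

    t_grid = np.arange(250, 500, 10)
    miller_bound = np.minimum(cdf_at(rt_a, t_grid) + cdf_at(rt_v, t_grid), 1.0)
    violation = cdf_at(rt_av, t_grid) - miller_bound   # positive values violate the race model,
    print(f"max violation: {violation.max():.2f}")      # i.e. evidence for integration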


Subject(s)
Auditory Perception/physiology , Brain Ischemia/physiopathology , Functional Laterality/physiology , Perceptual Disorders/physiopathology , Reaction Time/physiology , Stroke/pathology , Stroke/physiopathology , Visual Perception/physiology , Aged , Brain Ischemia/complications , Female , Humans , Magnetic Resonance Imaging , Male , Middle Aged , Perceptual Disorders/etiology , Stroke/complications , Stroke/diagnostic imaging , Visual Fields/physiology
11.
PLoS One ; 13(8): e0202414, 2018.
Article in English | MEDLINE | ID: mdl-30125311

ABSTRACT

The retinal location of visual information changes each time we move our eyes. Although it is now known that visual information is remapped in retinotopic coordinates across eye-movements (saccades), it is currently unclear how head-centered auditory information is remapped across saccades. Keeping track of the location of a sound source in retinotopic coordinates requires a rapid multi-modal reference frame transformation when making saccades. To reveal this reference frame transformation, we designed an experiment where participants attended an auditory or visual cue and executed a saccade. After the saccade had landed, an auditory or visual target could be presented either at the prior retinotopic location or at an uncued location. We observed that both auditory and visual targets presented at prior retinotopic locations were reacted to faster than targets at other locations. In a second experiment, we observed that spatial attention pointers obtained via audition are available in retinotopic coordinates immediately after an eye-movement is made. In a third experiment, we found evidence for an asymmetric cross-modal facilitation of information that is presented at the retinotopic location. In line with prior single cell recording studies, this study provides the first behavioral evidence for immediate auditory and cross-modal transsaccadic updating of spatial attention. These results indicate that our brain has efficient solutions for solving the challenges in localizing sensory input that arise in a dynamic context.
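
The reference-frame transformation at issue can be summarised, in a one-dimensional simplification, as subtracting the current eye-in-head position from the head-centred sound azimuth. The sketch below (hypothetical angles) shows why a retinotopic pointer to a sound must be updated whenever the eyes move.

    def head_to_retinotopic(azimuth_head_deg, eye_in_head_deg):
        """1-D simplification: retinotopic azimuth = head-centred azimuth - eye position."""
        return azimuth_head_deg - eye_in_head_deg

    sound_head = 12.0   # hypothetical sound source, 12 deg right of the head midline
    eye_before = 0.0    # fixating straight ahead
    eye_after = 10.0    # after a 10 deg rightward saccade

    print(head_to_retinotopic(sound_head, eye_before))   # 12.0 deg before the saccade
    print(head_to_retinotopic(sound_head, eye_after))    # 2.0 deg after: the pointer must be remapped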


Subject(s)
Attention/physiology , Eye Movements/physiology , Retina/physiology , Sound Localization/physiology , Adult , Female , Humans , Male
12.
J Vis ; 18(7): 6, 2018 07 02.
Article in English | MEDLINE | ID: mdl-30029270

ABSTRACT

The experience of our visual surroundings appears continuous, contradicting the erratic nature of visual processing due to saccades. A possible way the visual system can construct a continuous experience is by integrating presaccadic and postsaccadic visual input. However, saccades rarely land exactly at the intended location. Feature integration would therefore need to be robust against variations in saccade execution to facilitate visual continuity. In the current study, observers reported a feature (color) of the saccade target, which occasionally changed slightly during the saccade. In transsaccadic change-trials, observers reported a mixture of the pre- and postsaccadic color, indicating transsaccadic feature integration. Saccade landing distance was not a significant predictor of the reported color. Next, to investigate the influence of more extreme deviations of saccade landing point on color reports, we used a global effect paradigm in a second experiment. In global effect trials, a distractor appeared together with the saccade target, causing most saccades to land in between the saccade target and the distractor. Strikingly, even when saccades land further away (up to 4°) from the saccade target than one would expect under single target conditions, there was no effect of saccade landing point on the reported color. We reason that saccade landing point does not affect feature integration, due to dissociation between the intended saccade target and the actual saccade landing point. Transsaccadic feature integration seems to be a mechanism that is dependent on visual spatial attention, and, as a result, is robust against variance in saccade landing point.
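
The mixture reported here can be quantified by a single weight on the presaccadic colour. A minimal estimation sketch with hypothetical trial values (colour treated linearly for brevity; analyses on a colour wheel would use circular statistics):

    import numpy as np

    # Hypothetical change trials: presaccadic colour, postsaccadic colour and the reported colour (deg).
    pre = np.array([40.0, 10.0, 75.0, 120.0, 200.0, 310.0])
    post = np.array([55.0, 25.0, 60.0, 135.0, 185.0, 325.0])
    report = np.array([50.0, 20.0, 65.0, 130.0, 190.0, 320.0])

    # Model: report = w * pre + (1 - w) * post  =>  (report - post) = w * (pre - post)
    w = np.sum((report - post) * (pre - post)) / np.sum((pre - post) ** 2)
    print(f"presaccadic weight w = {w:.2f}  (w < 0.5: more reliance on the postsaccadic colour)")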


Subject(s)
Color Perception/physiology , Saccades/physiology , Vision, Ocular/physiology , Analysis of Variance , Attention/physiology , Bias , Female , Humans , Male , Young Adult
13.
Exp Brain Res ; 236(7): 1939-1951, 2018 Jul.
Article in English | MEDLINE | ID: mdl-29700577

ABSTRACT

The integration of information across sensory modalities is dependent on the spatiotemporal characteristics of the stimuli that are paired. Despite large variation in the distance over which events occur in our environment, relatively little is known regarding how stimulus-observer distance affects multisensory integration. Prior work has suggested that exteroceptive stimuli are integrated over larger temporal intervals in near relative to far space, and that larger multisensory facilitations are evident in far relative to near space. Here, we sought to examine the interrelationship between these previously established distance-related features of multisensory processing. Participants performed an audiovisual simultaneity judgment and redundant target task in near and far space, while audiovisual stimuli were presented at a range of temporal delays (i.e., stimulus onset asynchronies). In line with the previous findings, temporal acuity was poorer in near relative to far space. Furthermore, reaction time to asynchronously presented audiovisual targets suggested a temporal window for fast detection - a range of stimulus asynchronies that was also larger in near as compared to far space. However, the range of reaction times over which multisensory response enhancement was observed was limited to a restricted range of relatively small (i.e., 150 ms) asynchronies, and did not differ significantly between near and far space. Furthermore, for synchronous presentations, these distance-related (i.e., near vs. far) modulations in temporal acuity and multisensory gain correlated negatively at an individual subject level. Thus, the findings support the conclusion that multisensory temporal binding and gain are asymmetrically modulated as a function of distance from the observer, and specify that this relationship is specific to temporally synchronous audiovisual stimulus presentations.
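
Simultaneity-judgment data in such designs are typically summarised by fitting a function over SOA and reading off its width as the temporal binding window. A hedged sketch using a simple Gaussian and hypothetical response proportions (the study's actual model may differ):

    import numpy as np
    from scipy.optimize import curve_fit

    def sj_gaussian(soa, peak, centre, width):
        """Proportion of 'simultaneous' responses as a Gaussian over SOA (ms)."""
        return peak * np.exp(-((soa - centre) ** 2) / (2 * width ** 2))

    # Hypothetical simultaneity-judgment data (negative SOA = auditory first).
    soa = np.array([-300, -200, -100, -50, 0, 50, 100, 200, 300], dtype=float)
    p_simultaneous = np.array([0.05, 0.20, 0.60, 0.85, 0.95, 0.90, 0.70, 0.30, 0.10])

    (peak, centre, width), _ = curve_fit(sj_gaussian, soa, p_simultaneous, p0=[1.0, 0.0, 100.0])
    print(f"binding-window width (SD) = {width:.0f} ms, centred at {centre:.0f} ms")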


Subject(s)
Auditory Perception/physiology , Distance Perception/physiology , Judgment/physiology , Visual Perception/physiology , Acoustic Stimulation , Adolescent , Adult , Analysis of Variance , Correlation of Data , Female , Humans , Male , Photic Stimulation , Psychophysics , Reaction Time , Time Factors , Young Adult
14.
J Vis ; 17(6): 15, 2017 06 01.
Article in English | MEDLINE | ID: mdl-28654960

ABSTRACT

To facilitate visual continuity across eye movements, the visual system must presaccadically acquire information about the future foveal image. Previous studies have indicated that visual working memory (VWM) affects saccade execution. However, the reverse relation, the effect of saccade execution on VWM load, is less clear. To investigate the causal link between saccade execution and VWM, we combined a VWM task and a saccade task. Participants were instructed to remember one, two, or three shapes and performed either a No Saccade, a Single Saccade, or a Dual (corrective) Saccade task. The results indicate that items stored in VWM are reported less accurately if a single or a dual saccade task is performed in addition to retaining items in VWM. Importantly, the loss of response accuracy for items retained in VWM by performing a saccade was similar to committing an extra item to VWM. In a second experiment, we observed no cost of executing a saccade for auditory working memory performance, indicating that executing a saccade exclusively taxes the VWM system. Our results suggest that the visual system presaccadically stores the upcoming retinal image, which has a similar VWM load as committing one extra item to memory and interferes with stored VWM content. After the saccade, the visual system can retrieve this item from VWM to evaluate saccade accuracy. Our results support the idea that VWM is a system which is directly linked to saccade execution and promotes visual continuity across saccades.


Subject(s)
Memory, Short-Term/physiology , Saccades/physiology , Adolescent , Adult , Eye Movements/physiology , Female , Humans , Male , Mental Recall , Young Adult
15.
Exp Brain Res ; 235(8): 2511-2522, 2017 08.
Article in English | MEDLINE | ID: mdl-28528459

ABSTRACT

Since the discovery of neural regions in the monkey brain that respond preferentially to multisensory stimuli presented in proximal space, researchers have been studying this specialised spatial representation in humans. It has been demonstrated that approaching auditory or visual stimuli modulate tactile processing while they are within the peripersonal space (PPS). The aim of the current study is to investigate the additional effects of tactile expectation on the PPS-related multisensory interactions. Based on the output of a computational simulation, we expected that as tactile expectation increases rapidly during the course of the motion of the visual stimulus, the outcome RT curves would mask the multisensory contribution of PPS. When the tactile expectation remains constant during the motion, the PPS-related spatially selective multisensory processes become apparent. The behavioural results from the human experiments followed the pattern predicted by the simulation. That is, rapidly changing levels of tactile expectation, caused by dynamic visual stimuli, mask the outcome of the multisensory processes within peripersonal space. This indicates that both PPS-related multisensory interactions and tactile expectations play an important role in anticipating and responding to interactions with the body.
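
A toy version of the kind of simulation described: RT facilitation from a sigmoidal peripersonal-space component plus a tactile-expectation component that ramps up during the approach. All functional forms and parameters below are assumptions chosen only to show how a steep expectation ramp can swamp the PPS component in the resulting RT curve.

    import numpy as np

    def pps_facilitation(distance_m, boundary=0.4, slope=10.0, gain_ms=30.0):
        """Sigmoidal speed-up (ms) that grows as the stimulus enters peripersonal space."""
        return gain_ms / (1.0 + np.exp(slope * (distance_m - boundary)))

    def expectation_facilitation(elapsed_s, rate_ms_per_s=40.0):
        """Tactile expectation rising over the course of the approach (hypothetical linear ramp)."""
        return rate_ms_per_s * elapsed_s

    distances = np.linspace(1.0, 0.1, 10)   # metres: visual stimulus looming towards the hand
    elapsed = np.linspace(0.0, 1.0, 10)     # seconds since motion onset
    rt = 450.0 - pps_facilitation(distances) - expectation_facilitation(elapsed)
    print(np.round(rt))   # the steady drop from expectation can mask the sigmoidal PPS step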


Subject(s)
Intention , Personal Space , Space Perception/physiology , Touch/physiology , Adolescent , Adult , Computer Simulation , Cues , Female , Humans , Male , Models, Psychological , Photic Stimulation , Physical Stimulation , Psychomotor Performance , Reaction Time/physiology , Statistics, Nonparametric , Young Adult
16.
Atten Percept Psychophys ; 79(1): 138-153, 2017 Jan.
Article in English | MEDLINE | ID: mdl-27743259

ABSTRACT

One of the factors contributing to a seamless visual experience is object correspondence-that is, the integration of pre- and postsaccadic visual object information into one representation. Previous research had suggested that before the execution of a saccade, a target object is loaded into visual working memory and subsequently is used to locate the target object after the saccade. Until now, studies on object correspondence have not taken previous fixations into account. In the present study, we investigated the influence of previously fixated information on object correspondence. To this end, we adapted a gaze correction paradigm in which a saccade was executed toward either a previously fixated or a novel target. During the saccade, the stimuli were displaced such that the participant's gaze landed between the target stimulus and a distractor. Participants then executed a corrective saccade to the target. The results indicated that these corrective saccades had lower latencies toward previously fixated than toward nonfixated targets, indicating object-specific facilitation. In two follow-up experiments, we showed that presaccadic spatial and object (surface feature) information can contribute separately to the execution of a corrective saccade, as well as in conjunction. Whereas the execution of a corrective saccade to a previously fixated target object at a previously fixated location is slowed down (i.e., inhibition of return), corrective saccades toward either a previously fixated target object or a previously fixated location are facilitated. We concluded that corrective saccades are executed on the basis of object files rather than of unintegrated feature information.


Subject(s)
Fixation, Ocular/physiology , Memory, Short-Term/physiology , Saccades/physiology , Visual Perception/physiology , Adult , Female , Humans , Male , Young Adult
17.
Vision Res ; 123: 46-55, 2016 06.
Article in English | MEDLINE | ID: mdl-27164053

ABSTRACT

When executing an eye movement to a target location, the presence of an irrelevant distracting stimulus can influence the saccade metrics and latency. The present study investigated the influence of distractors of different sensory modalities (i.e. auditory, visual and audiovisual) which were presented at various distances (i.e. close or remote) from a visual target. The interfering effects of a bimodal distractor were more pronounced in the spatial domain than in the temporal domain. The results indicate that the direction of interference depended on the spatial layout of the visual scene. The close bimodal distractor caused the saccade endpoint and saccade trajectory to deviate towards the distractor whereas the remote bimodal distractor caused a deviation away from the distractor. Furthermore, saccade averaging and trajectory deviation evoked by a bimodal distractor was larger compared to the effects evoked by a unimodal distractor. This indicates that a bimodal distractor evoked stronger spatial oculomotor competition compared to a unimodal distractor and that the direction of the interference depended on the distance between the target and the distractor. Together, these findings suggest that the oculomotor vector to irrelevant bimodal input is enhanced and that the interference by multisensory input is stronger compared to unisensory input.


Subject(s)
Attention/physiology , Saccades/physiology , Visual Perception/physiology , Adult , Analysis of Variance , Female , Humans , Male , Photic Stimulation/methods , Reaction Time/physiology , Young Adult
18.
Neuroreport ; 25(17): 1381-5, 2014 Dec 03.
Article in English | MEDLINE | ID: mdl-25340562

ABSTRACT

Neglect is a heterogeneous disorder and may be specific for only peripersonal or extrapersonal space. Behavioural consequences at the level of independence in basic activities of daily living differ between patients with peripersonal and those with extrapersonal neglect. One of the most important factors that determine independence in basic activities of daily living is balance. Here, potential differences in postural imbalance between patients with peripersonal and those with extrapersonal neglect were investigated. A total of 81 stroke patients were screened within the first 2 weeks after admission to the rehabilitation centre. Mediolateral (horizontal) and anteroposterior (vertical) displacements of the centre of pressure (CoP) were measured using a Nintendo Wii Balance Board in an eyes-open and eyes-closed condition. Patients with peripersonal neglect showed a significant displacement of mediolateral CoP from the ideal CoP, but not in the anteroposterior dimension or postural sway. Patients with extrapersonal neglect did not differ from the no-neglect patients in terms of displacement of both mediolateral and anteroposterior CoP and postural sway. There were no differences between the eyes-open and eyes-closed conditions in any of the groups. Consequences of region-specific neglect on postural imbalance appear to be very specific and cannot be accounted for by neglecting visual information only. The current findings might directly reflect a relation between body perception and body representation and (actions in) peripersonal space. When diagnosing neglect, it is relevant to distinguish the type of region-specific neglect and, when needed, to adjust the rehabilitation programme accordingly.
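
The posturographic measures used here reduce to simple statistics of the centre-of-pressure (CoP) time series: signed mean displacement in each dimension and the sway path length. A sketch on simulated balance-board samples (values and sampling rate are hypothetical):

    import numpy as np

    rng = np.random.default_rng(1)
    fs = 50                                       # Hz, hypothetical sampling rate
    n = 30 * fs                                   # 30-second recording
    cop_ml = 1.5 + 0.4 * rng.standard_normal(n)   # mediolateral CoP (cm), offset from midline
    cop_ap = 0.1 + 0.4 * rng.standard_normal(n)   # anteroposterior CoP (cm)

    ideal = 0.0                                   # ideal CoP between the feet
    ml_displacement = cop_ml.mean() - ideal       # signed mediolateral displacement
    ap_displacement = cop_ap.mean() - ideal       # signed anteroposterior displacement
    sway_path = np.sum(np.hypot(np.diff(cop_ml), np.diff(cop_ap)))   # total sway path length
    print(f"ML = {ml_displacement:.2f} cm, AP = {ap_displacement:.2f} cm, sway = {sway_path:.1f} cm")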


Subject(s)
Perceptual Disorders/physiopathology , Postural Balance , Activities of Daily Living , Adolescent , Adult , Aged , Aged, 80 and over , Biomechanical Phenomena , Body Image , Female , Humans , Male , Middle Aged , Perceptual Disorders/etiology , Stroke/complications , Stroke/physiopathology , Vision, Ocular , Young Adult
19.
Psychon Bull Rev ; 21(3): 708-14, 2014 Jun.
Article in English | MEDLINE | ID: mdl-24101573

ABSTRACT

The aim of the present study was to investigate exogenous crossmodal orienting of attention in three-dimensional (3-D) space. Most studies in which the orienting of attention has been examined in 3-D space concerned either exogenous intramodal or endogenous crossmodal attention. Evidence for exogenous crossmodal orienting of attention in depth is lacking. Endogenous and exogenous attention are behaviorally different, suggesting that they are two different mechanisms. We used the orthogonal spatial-cueing paradigm and presented auditory exogenous cues at one of four possible locations in near or far space before the onset of a visual target. Cues could be presented at the same (valid) or at a different (invalid) depth from the target (radial validity), and on the same (valid) or on a different (invalid) side (horizontal validity), while we blocked the depth at which visual targets were presented. In addition to an overall validity effect (valid RTs < invalid RTs) in horizontal space, we observed an interaction between the horizontal and radial validity of the cue: The horizontal validity effect was present only when the cue and the target were presented at the same depth. No horizontal validity effect was observed when the cue and the target were presented at different depths. These results suggest that exogenous crossmodal attention is "depth-aware," and they are discussed in the context of the supramodal hypothesis of attention.
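
The reported pattern amounts to computing the horizontal validity effect separately at the cued and the uncued depth and comparing the two. A minimal illustration with hypothetical condition means:

    # Hypothetical mean RTs (ms): horizontal cue validity crossed with radial (depth) validity.
    rt = {
        ("valid", "same depth"): 310,
        ("invalid", "same depth"): 335,
        ("valid", "different depth"): 324,
        ("invalid", "different depth"): 326,
    }

    effect_same = rt[("invalid", "same depth")] - rt[("valid", "same depth")]
    effect_diff = rt[("invalid", "different depth")] - rt[("valid", "different depth")]
    print(f"horizontal validity effect at the cued depth: {effect_same} ms; at the other depth: {effect_diff} ms")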


Subject(s)
Attention/physiology , Auditory Perception/physiology , Depth Perception/physiology , Adult , Cues , Female , Humans , Male , Young Adult
20.
J Clin Exp Neuropsychol ; 35(8): 799-811, 2013.
Article in English | MEDLINE | ID: mdl-23984973

ABSTRACT

Visuospatial neglect has been observed in the horizontal (left/right) and vertical (up/down) dimension and depends on the distance at which a task is presented (near/far). Previously, studies have mainly focused on investigating the overall severity of neglect in near and far space in a group of neglect patients instead of examining subgroups of neglect patients with different types of distance-specific neglect. We investigated the spatial specificity (near vs. far space), frequency, and severity of neglect in the horizontal and vertical dimensions in a large group of stroke patients. We used three tasks to assess neglect in near (30 cm) and far (120 cm) space: a shape cancellation, a letter cancellation, and a line bisection task. Patients were divided into four groups based on their performance: a group without neglect (N-F-), a near-only neglect group (N+F-), a far-only neglect group (N-F+), and a near and far neglect group (N+F+). About 40% of our sample showed neglect. Depending on the task, N+F- was observed in 8 to 22% of the sample, whereas N-F+ varied between 8% and 11%, and N+F+ varied between 11% and 14% of the sample. The current findings indicate that horizontal and vertical biases in performance can be confined to one region of space and are task dependent. We recommend testing for far space neglect during neuropsychological assessments in clinical practice, because this cannot be diagnosed using standard paper-and-pencil tasks.


Subject(s)
Functional Laterality/physiology , Perceptual Disorders/etiology , Space Perception/physiology , Stroke/complications , Female , Humans , Male , Middle Aged , Neuropsychological Tests , Perceptual Disorders/physiopathology , Stroke/physiopathology