1.
Brain Topogr; 27(6): 707-30, 2014 Nov.
Article in English | MEDLINE | ID: mdl-24722880

ABSTRACT

We process information from the world through multiple senses, and the brain must decide which information belongs together and which should be segregated. One challenge in studying such multisensory integration is how to quantify the multisensory interactions, a challenge amplified by the host of methods now used to measure neural, behavioral, and perceptual responses. Many of the measures developed to quantify multisensory integration (many of which were derived from single-unit analyses) have been applied across these different methods without much consideration for the nature of the process being studied. Here, we provide a review focused on the means by which experimenters quantify multisensory processes and integration across a range of commonly used experimental methodologies. We emphasize the most commonly employed measures, including single- and multiunit responses, local field potentials, functional magnetic resonance imaging, and electroencephalography, along with behavioral measures of detection, accuracy, and response times. In each section, we discuss the metrics commonly used to quantify multisensory interactions, including the rationale for their use, their advantages, and the drawbacks and caveats associated with them. We also discuss possible alternatives to the most commonly used metrics.
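As a concrete illustration of one classic single-unit metric covered by reviews like this one, the sketch below computes the multisensory enhancement index, ME = (CM - SMmax) / SMmax x 100, where CM is the mean response to the combined stimulus and SMmax is the largest mean unisensory response. The firing rates are hypothetical, and the code is an illustration rather than anything provided by the article.

```python
def multisensory_enhancement(combined, unisensory):
    """Classic single-unit enhancement index:
    ME = (CM - SM_max) / SM_max * 100, where CM is the mean response
    to the combined stimulus and SM_max is the largest mean
    unisensory response."""
    sm_max = max(unisensory)
    return (combined - sm_max) / sm_max * 100.0

# Hypothetical mean firing rates (spikes/s) for one neuron.
auditory = 12.0
visual = 8.0
audiovisual = 30.0

me = multisensory_enhancement(audiovisual, [auditory, visual])
print(f"Multisensory enhancement: {me:.0f}%")
# 150%: the combined response exceeds the best unisensory response,
# and here also exceeds the sum of the unisensory responses
# (superadditivity, since 30 > 12 + 8).
```

A positive ME indicates enhancement and a negative ME indicates response depression; comparing CM against the sum of the unisensory responses instead tests additivity, one of the alternative criteria such reviews weigh.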


Subject(s)
Brain Mapping/methods; Brain/physiology; Neurons/physiology; Perception/physiology; Animals; Data Interpretation, Statistical; Electroencephalography/methods; Humans; Magnetic Resonance Imaging/methods
2.
J Acoust Soc Am; 130(1): 1-4, 2011 Jul.
Article in English | MEDLINE | ID: mdl-21786870

ABSTRACT

The ability to obtain reliable phonetic information from a talker's face during speech perception is an important skill. However, lip-reading abilities vary considerably across individuals, and normative data on lip-reading abilities in young normal-hearing listeners are currently lacking. This letter describes results from a visual-only sentence recognition experiment using CUNY sentences and provides the mean number of words correct and the standard deviation for different sentence lengths. Additionally, the method for calculating T-scores is provided to facilitate conversion between raw and standardized scores. Clinicians and researchers in lip-reading studies can use this statistic as a benchmark for determining whether an individual's lip-reading score falls within, above, or below the normal range.
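The T-score conversion mentioned here follows the standard definition (normative mean maps to 50, each normative SD to 10 points): T = 50 + 10 x (raw - mean) / SD. The sketch below uses placeholder normative values, not the ones actually reported in the letter.

```python
def t_score(raw, norm_mean, norm_sd):
    """Convert a raw score to a T-score against normative data:
    T = 50 + 10 * (raw - norm_mean) / norm_sd."""
    return 50.0 + 10.0 * (raw - norm_mean) / norm_sd

# Placeholder normative values -- substitute the published means and
# standard deviations for the appropriate sentence length.
NORM_MEAN = 25.0  # hypothetical mean words correct
NORM_SD = 6.0     # hypothetical standard deviation

print(f"T-score: {t_score(34, NORM_MEAN, NORM_SD):.1f}")
# 65.0 -> 1.5 SD above the normative mean, i.e., above the typical range.
```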


Subject(s)
Lipreading; Speech Perception; Cues; Humans; Photic Stimulation; Recognition, Psychology; Reference Values; Task Performance and Analysis
3.
Neuroimage; 49(4): 3308-18, 2010 Feb 15.
Article in English | MEDLINE | ID: mdl-20004723

ABSTRACT

The temporal synchrony of auditory and visual signals is known to affect the perception of an external event, yet it is unclear what neural mechanisms underlie this influence. Using parametrically varied levels of stimulus asynchrony in combination with BOLD fMRI, we identified two anatomically distinct subregions of multisensory superior temporal cortex (mSTC) that showed qualitatively distinct BOLD activation patterns. A synchrony-defined subregion of mSTC (synchronous > asynchronous) responded only when auditory and visual stimuli were synchronous, whereas a bimodal subregion of mSTC (auditory > baseline and visual > baseline) showed significant activation for all presentations, but with monotonically increasing activation as asynchrony increased. The presence of two distinct activation patterns suggests that the two subregions of mSTC may rely on different neural mechanisms to integrate audiovisual sensory signals. An additional whole-brain analysis revealed a network of regions that responded more to synchronous than to asynchronous speech, including right mSTC and bilateral superior colliculus, fusiform gyrus, lateral occipital cortex, and extrastriate visual cortex. The spatial location of individual mSTC ROIs was much more variable in the left than in the right hemisphere, suggesting that individual differences may contribute to the right lateralization of mSTC in a group SPM. These findings suggest that bilateral mSTC is composed of distinct multisensory subregions that integrate audiovisual speech signals through qualitatively different mechanisms and may be differentially sensitive to stimulus properties including, but not limited to, temporal synchrony.
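To make the two contrast definitions concrete, here is a toy voxel-classification sketch over simulated beta estimates. The threshold and data are made up, and a real analysis would use full GLM statistics with multiple-comparison correction rather than this simple rule.

```python
import numpy as np

# Simulated voxelwise GLM beta estimates, one value per voxel; in a real
# analysis these would come from a fitted fMRI GLM with proper statistics.
rng = np.random.default_rng(0)
n_voxels = 1000
beta_sync = rng.normal(0.5, 1.0, n_voxels)   # synchronous audiovisual speech
beta_async = rng.normal(0.2, 1.0, n_voxels)  # asynchronous audiovisual speech
beta_aud = rng.normal(0.4, 1.0, n_voxels)    # auditory-only vs. baseline
beta_vis = rng.normal(0.4, 1.0, n_voxels)    # visual-only vs. baseline

THRESH = 0.5  # arbitrary effect-size cutoff for this sketch

# Synchrony-defined voxels: synchronous > asynchronous.
synchrony_defined = (beta_sync - beta_async) > THRESH

# Bimodal voxels: both unisensory responses exceed baseline.
bimodal = (beta_aud > THRESH) & (beta_vis > THRESH)

print(f"synchrony-defined: {synchrony_defined.sum()} voxels, "
      f"bimodal: {bimodal.sum()} voxels")
```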


Subject(s)
Auditory Cortex/physiology; Auditory Perception/physiology; Nerve Net/physiology; Temporal Lobe/physiology; Visual Cortex/physiology; Visual Perception/physiology; Adult; Evoked Potentials, Auditory/physiology; Evoked Potentials, Visual/physiology; Female; Humans; Magnetic Resonance Imaging/methods; Male