Results 1 - 20 of 25
1.
Sci Rep ; 12(1): 9089, 2022 06 14.
Article in English | MEDLINE | ID: mdl-35701462

ABSTRACT

We investigated how the emotion perception process is shaped by hospitality expertise. Forty subjects were divided into an OMOTENASHI group, who worked at inns considered to embody the Japanese spirit of hospitality (OMOTENASHI), and a CONTROL group without experience in the hospitality industry. We presented neutral, happy, and angry faces to examine the P100 and N170 components evoked by these faces, and used a favor rating test to evaluate psychophysical aspects of emotional perception. In the favor rating test, scores were significantly lower (less favorable) in OMOTENASHI than in CONTROL. Regarding event-related potential components, the maximum amplitude of the P100 was significantly larger in OMOTENASHI than in CONTROL for a neutral face at the right occipital electrode and for an angry face at both occipital electrodes. However, the peak latency and maximum amplitude of the N170 did not differ significantly between OMOTENASHI and CONTROL at either temporal electrode for any emotion condition. The differences in the favor rating test and the P100 in OMOTENASHI suggest that inn workers may notice guests' facial emotions more quickly and be more sensitive to them owing to hospitality training, and/or that hospitality expertise may increase attention to emotion through top-down and/or bottom-up processing.
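The P100 and N170 measures reported here are peak amplitudes and latencies taken from averaged ERP waveforms within fixed post-stimulus windows. As a minimal illustrative sketch only (the electrode, time windows, sampling rate, and data array are assumptions, not values from the study), such a peak search might be written as:

```python
import numpy as np

def peak_in_window(erp, times, tmin, tmax, polarity="neg"):
    """Return (latency_s, amplitude) of the extreme value of an averaged
    ERP waveform within [tmin, tmax]. polarity='neg' for N170-like
    components, 'pos' for P100-like components."""
    mask = (times >= tmin) & (times <= tmax)
    segment = erp[mask]
    idx = segment.argmin() if polarity == "neg" else segment.argmax()
    return times[mask][idx], segment[idx]

# Hypothetical averaged waveform: 1 s epoch sampled at 500 Hz.
fs = 500
times = np.arange(-0.1, 0.9, 1.0 / fs)          # seconds relative to face onset
erp_o2 = np.random.randn(times.size) * 1e-6     # stand-in for a real average (volts)

p100_lat, p100_amp = peak_in_window(erp_o2, times, 0.08, 0.13, polarity="pos")
n170_lat, n170_amp = peak_in_window(erp_o2, times, 0.13, 0.20, polarity="neg")
print(f"P100: {p100_lat*1000:.0f} ms, {p100_amp*1e6:.2f} uV")
print(f"N170: {n170_lat*1000:.0f} ms, {n170_amp*1e6:.2f} uV")
```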


Subject(s)
Facial Expression , Facial Recognition , Electroencephalography , Emotions , Evoked Potentials , Humans , Japan , Perception
2.
Front Physiol ; 13: 803274, 2022.
Article in English | MEDLINE | ID: mdl-35431988

ABSTRACT

The face has a large amount of information that is useful for humans in social communication. Recently, non-invasive methods have been used to investigate human brain activity related to perception and cognition processes. Electroencephalography (EEG) and magnetoencephalography (MEG) have excellent temporal resolution and reasonably good spatial resolution. Therefore, they are useful to investigate time sequences of human brain activity related to the face perception process. In this review, we introduce our previous EEG and MEG studies of human face perception that demonstrated the following characteristics of face perception processing: (1) Event-related components in the temporal area related to the activity in the inferior temporal (IT) area, corresponding to the fusiform face area (FFA), are evoked approximately 180 msec after the presentation of a face. The activity in the IT area plays an important role in the detection processing of a face, and the contours of a face affect the activity in the IT areas. (2) Event-related components in the temporal area related to the superior temporal sulcus (STS) activity are larger when eyes are averted than when directly looking into the eyes. (3) The direction of features of a face affects the face perception processing in the right hemisphere. On the other hand, the matching of the direction between the contours and features of a face affects the processing in the left hemisphere. (4) Random dots blinking (RDB), which uses temporal changes in patterns of many small dots to present stimuli without a change in luminance during the presentation of a face, is a useful visual stimulus method to investigate the brain activity related to face perception processing in the IT area using EEG and MEG.
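Point (4) describes random dots blinking (RDB), in which the stimulus is carried entirely by temporal changes in the dot pattern while dot density, and hence mean luminance, stays constant. The sketch below shows one plausible way to generate such frames; the grid size, dot density, mask shape, and the choice to re-randomize only the dots inside the shape are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
H, W, P_DOT = 200, 200, 0.1            # hypothetical grid and dot density

# Crude circular "face contour" mask, purely for illustration.
yy, xx = np.mgrid[:H, :W]
mask = np.abs(np.hypot(yy - H / 2, xx - W / 2) - 70) < 3

def rdb_frames(n_frames, shape_mask):
    """Yield binary frames of randomly blinking dots. Dot density (and hence
    mean luminance) is identical inside and outside the mask; only the dots
    inside the mask are re-randomized every frame, so the shape is defined
    purely by temporal change."""
    background = rng.random((H, W)) < P_DOT     # static outside the shape
    for _ in range(n_frames):
        frame = background.copy()
        frame[shape_mask] = rng.random(shape_mask.sum()) < P_DOT
        yield frame

frames = list(rdb_frames(30, mask))             # e.g. 0.5 s at 60 Hz
print([f.mean().round(3) for f in frames[:5]])  # mean luminance stays ~P_DOT
```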

3.
Front Hum Neurosci ; 9: 263, 2015.
Article in English | MEDLINE | ID: mdl-26082700

ABSTRACT

The main objectives of this study were to investigate the development of face perception in Japanese children, focusing on the changes in face processing strategies (holistic and/or configural vs. feature-based) that occur during childhood. To achieve this, we analyzed the face-related N170 component evoked by upright face, inverted face, and eyes stimuli in 82 Japanese children aged between 8 and 13 years. During the experiment, the children were asked to perform a target detection task in which they were told to press a button when they saw images of faces or kettles with mustaches, glasses, and fake noses; i.e., an implicit face perception task. The N170 signals observed after the presentation of the upright face stimuli were longer in duration and/or had at least two peaks in the 8-11-year-old children, whereas those seen in the 12-13-year-old children were sharp and had only a single peak. N170 latency was significantly longer after the presentation of the eyes stimuli than after the presentation of the upright face stimuli in the 10- and 12-year-old children. In addition, significant differences in N170 latency were observed among all three stimulus types in the 13-year-old children. N170 amplitude was significantly greater after the presentation of the eyes stimuli than after the presentation of the upright face stimuli in the 8-10- and 12-year-old children. The results of the present study indicate that the upright face stimuli were processed using holistic and/or configural processing by the 13-year-old children.
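The finding that the upright-face N170 of younger children had "at least two peaks" implies counting local negative peaks within the N170 window of each averaged waveform. A hedged sketch of that step (the window, sampling rate, prominence threshold, and stand-in data are assumptions):

```python
import numpy as np
from scipy.signal import find_peaks

fs = 500
times = np.arange(-0.1, 0.6, 1.0 / fs)
erp = np.random.randn(times.size) * 1e-6        # stand-in for an averaged waveform (V)

win = (times >= 0.13) & (times <= 0.30)         # generous N170 search window
# Negative peaks = positive peaks of the inverted signal; the prominence
# threshold (1 microvolt here) is an arbitrary choice for the sketch.
peaks, props = find_peaks(-erp[win], prominence=1e-6)
print(f"{peaks.size} negative peak(s) in the N170 window at "
      f"{np.round(times[win][peaks] * 1000).astype(int)} ms")
```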

4.
Front Hum Neurosci ; 8: 550, 2014.
Article in English | MEDLINE | ID: mdl-25120453

ABSTRACT

In this review, we introduce three of our studies that focused on facial movements. In the first study, we examined the temporal characteristics of neural responses elicited by viewing mouth movements, and assessed differences between the responses to mouth opening and closing movements and an averting eyes condition. Our results showed that the occipitotemporal area, the human MT/V5 homologue, was active in the perception of both mouth and eye motions. Viewing mouth and eye movements did not elicit significantly different activity in the occipitotemporal area, which indicated that the movements of facial parts may be processed in a common manner that differs from the processing of motion in general. In the second study, we investigated whether early activity in the occipitotemporal region evoked by eye movements was influenced by the facial contour and/or features such as the mouth. Our results revealed specific information processing for eye movements in the occipitotemporal region, and this activity was significantly influenced by whether the movements appeared with the facial contour and/or features, in other words, whether the eyes moved, even if the movement itself was the same. In the third study, we examined the effects of inverting the facial contour (hair and chin) and features (eyes, nose, and mouth) on processing for static and dynamic face perception. Our results showed the following: (1) in static face perception, activity in the right fusiform area was affected more by the inversion of features, while that in the left fusiform area was affected more by a disruption of the spatial relationship between the contour and features; and (2) in dynamic face perception, activity in the right occipitotemporal area was affected by the inversion of the facial contour.

5.
Brain Dev ; 35(4): 323-30, 2013 Apr.
Article in English | MEDLINE | ID: mdl-22677570

ABSTRACT

OBJECTIVE: In order to evaluate whether face perception is intact in Williams syndrome (WS), face inversion effects (FIE) in the event-related potential (ERP) or magnetoencephalography (MEG) were investigated in three teenaged patients with WS. METHODS: Responses to inverted faces and upright faces were compared using MEG for a 13-year-old girl with WS (subject A) and ERP for boys with WS aged 16 and 14 years (subjects B and C, respectively). RESULTS: Although age-matched control children showed FIE in both the MEG and ERP studies, two subjects with WS (A and B) showed no FIE at all. The neurophysiological ERP data of subject B were significantly different from those of the age-matched controls. On the other hand, a boy with WS (subject C) showed typical FIE in the same manner as the age-matched controls. CONCLUSIONS: The difference between those with and without FIE was not explained merely by chronological age or by a simple delay in mental age or in the ability to discriminate among upright faces. The absence of FIE may be related to the severity of a deficit in dorsal pathway function that is characteristic of the syndrome.


Subject(s)
Evoked Potentials/physiology , Face , Pattern Recognition, Visual/physiology , Perceptual Disorders/etiology , Williams Syndrome/complications , Adolescent , Child , Electroencephalography , Female , Humans , Longitudinal Studies , Magnetic Resonance Imaging , Magnetoencephalography , Male , Photic Stimulation , Reaction Time
6.
Brain Nerve ; 64(7): 727-35, 2012 Jul.
Article in Japanese | MEDLINE | ID: mdl-22764344

ABSTRACT

In this review article, we summarize our results from magnetoencephalography (MEG) and electroencephalography (EEG) studies on face perception. The primary results were as follows: (1) facial (eye and mouth) movements are processed differently from general motion perception, but eye and mouth movements are likely processed in the same manner. (2) In a study investigating the interaction between auditory and visual stimuli relating to vowel sounds in the auditory cortex, vowel sound perception in the auditory cortex, at least in the primary processing stage, was not affected by simultaneously viewing mouth movements. (3) In a study investigating the effects of face contour and features on early occipitotemporal activity when viewing eye movement, there was evidence of specific information processing for eye movements in the occipitotemporal region, and this activity was significantly influenced by whether the movements appeared with the face contour and/or features. (4) In a study investigating the effects of inverting facial contour (hair and chin) and features (eyes, nose and mouth) on the processing of static and dynamic face perception, activity in the right fusiform area was more affected by the inversion of features, whereas activity in the left fusiform area was more affected by disruption of the spatial relationship between the contour and features in static face perception, and activity in the right occipitotemporal area was most affected by inversion of the facial contour in dynamic face perception. (5) In a study investigating the perception of changes in facial emotion, the areas of the brain involved in perceiving changes in facial emotion were found to have not matured by 14 years of age.


Subject(s)
Brain Mapping , Brain/physiology , Face , Form Perception/physiology , Pattern Recognition, Visual/physiology , Humans , Magnetoencephalography
7.
Brain Res ; 1383: 230-41, 2011 Apr 06.
Article in English | MEDLINE | ID: mdl-21295020

ABSTRACT

We investigated the effects of inverting facial contour (hair and chin) and features (eyes, nose and mouth) on processing for static and dynamic face perception using magnetoencephalography (MEG). We used apparent motion, in which the first stimulus (S1) was replaced by a second stimulus (S2) with no interstimulus interval and subjects perceived visual motion, and presented three conditions as follows: (1) U&U: Upright contour and Upright features, (2) U&I: Upright contour and Inverted features, and (3) I&I: Inverted contour and Inverted features. In static face perception (S1 onset), the peak latency of the fusiform area's activity, which was related to static face perception, was significantly longer for U&I and I&I than for U&U in the right hemisphere and for U&I than for U&U and I&I in the left. In dynamic face perception (S2 onset), the strength (moment) of the occipitotemporal area's activity, which was related to dynamic face perception, was significantly larger for I&I than for U&U and U&I in the right hemisphere, but not the left. These results can be summarized as follows: (1) in static face perception, the activity of the right fusiform area was more affected by the inversion of features while that of the left fusiform area was more affected by the disruption of the spatial relation between the contour and features, and (2) in dynamic face perception, the activity of the right occipitotemporal area was affected by the inversion of the facial contour.


Subject(s)
Brain Mapping , Brain/physiology , Face , Form Perception/physiology , Pattern Recognition, Visual/physiology , Adult , Female , Humans , Magnetoencephalography , Male , Middle Aged , Photic Stimulation , Signal Processing, Computer-Assisted , Young Adult
8.
Clin Neurophysiol ; 122(3): 530-538, 2011 Mar.
Article in English | MEDLINE | ID: mdl-20724212

ABSTRACT

OBJECTIVE: The development of the perception of changes in facial emotion was investigated using event-related potentials (ERPs) in children and adults. METHODS: Four different conditions were presented: (1) N-H: a neutral face that suddenly changed to a happy face; (2) H-N: the reverse of N-H; (3) N-A: a neutral face that suddenly changed to an angry face; and (4) A-N: the reverse of N-A. RESULTS: In the bilateral posterior temporal areas, a negative component was evoked by all conditions within 150-300 ms in younger children (7-10 years old), older children (11-14 years old), and adults (23-33 years old). Peak latency was significantly shorter and amplitude significantly smaller in adults than in younger and older children. Moreover, maximum amplitude was significantly larger for N-H and N-A than for H-N and A-N in younger children, and for N-H than for the other three conditions in adults. CONCLUSION: The areas of the brain involved in perceiving changes in facial emotion have not matured by 14 years of age. SIGNIFICANCE: Our study is the first to clarify a difference between children and adults in the perception of facial emotional change.
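The abstract does not state which statistical test produced the developmental effects; as a purely illustrative assumption, a one-way comparison of peak latencies across the three age groups could be run as follows (all values invented):

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
# Hypothetical peak latencies (ms) of the posterior-temporal negativity,
# one value per participant; the numbers are invented for illustration.
younger = rng.normal(260, 25, 15)   # 7-10 years
older   = rng.normal(240, 25, 15)   # 11-14 years
adults  = rng.normal(200, 20, 15)   # 23-33 years

f_stat, p_val = f_oneway(younger, older, adults)
print(f"one-way ANOVA across age groups: F = {f_stat:.2f}, p = {p_val:.4f}")
```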


Subject(s)
Electroencephalography , Emotions/physiology , Evoked Potentials/physiology , Facial Expression , Social Perception , Adolescent , Adult , Aging/physiology , Analysis of Variance , Brain/growth & development , Brain/physiology , Child , Data Interpretation, Statistical , Female , Humans , Male , Photic Stimulation , Temporal Lobe/physiology , Young Adult
9.
Exp Brain Res ; 194(4): 597-604, 2009 Apr.
Article in English | MEDLINE | ID: mdl-19252904

ABSTRACT

We developed a visual 3D model of a space module and used magnetoencephalography to analyze whether activity in the auditory cortex is influenced by rotating the image. We presented a 1,000 Hz pure tone as an auditory stimulus in four different visual conditions: (1) RR: a virtual image rotated around the center, (2) VR: images rotated vertically, (3) HR: images rotated horizontally, and (4) ST: the images did not rotate. We compared the differences in the auditory evoked component among the conditions. The dipoles were estimated to lie in Heschl's gyrus. The dipole moment was significantly larger for RR and VR than for ST in the right hemisphere. When interhemispheric differences were examined for each visual condition, the dipole moments for RR and VR were significantly larger in the right hemisphere than in the left hemisphere. Auditory activity was thus influenced by visual movement that induced self-motion perception, and the effect of such visual movement on the auditory cortex was right-dominant.


Subject(s)
Auditory Cortex/physiology , Evoked Potentials, Auditory/physiology , Motion Perception/physiology , Acoustic Stimulation , Adult , Analysis of Variance , Female , Functional Laterality , Humans , Magnetic Resonance Imaging , Magnetoencephalography , Male , Middle Aged , Photic Stimulation , Rotation , User-Computer Interface , Young Adult
10.
Exp Brain Res ; 193(2): 255-65, 2009 Feb.
Article in English | MEDLINE | ID: mdl-19002677

ABSTRACT

Using random dots blinking (RDB), which reflects the activity of higher visual areas related to face perception, we presented the following stimuli: (1) Upright: a schematic face; (2) Inverted: the Upright stimulus inverted; and (3) Scrambled: the same contour and features as in Upright but with the spatial relation distorted. Clear negative components (N-ERP250) were identified at approximately 250 ms after stimulus onset. At the T5 and T6 electrodes, the peak latency was significantly longer for Inverted and Scrambled than for Upright. At the P4 electrode, the maximum amplitude was significantly larger for Scrambled than for Upright and Inverted. These results indicate that the delayed latency for Inverted and Scrambled reflects additional analytic processing caused by the configural distortion, and that the increased amplitude for Scrambled indicates further processing caused by the distortion of the spatial relationship between the contour and features.


Subject(s)
Brain/physiology , Face , Pattern Recognition, Visual , Visual Perception , Adult , Analysis of Variance , Brain Mapping , Electroencephalography , Evoked Potentials , Female , Humans , Magnetic Resonance Imaging , Male , Photic Stimulation , Recognition, Psychology
11.
Brain Res Bull ; 77(5): 264-73, 2008 Nov 25.
Article in English | MEDLINE | ID: mdl-18793705

ABSTRACT

To understand the processing of facial expressions in terms of social communication, it is important to clarify how they are influenced by environmental stimuli such as natural scenes or objects. We investigated how and when neural responses to facial expressions were modulated by a natural scene and by an object containing emotional information. A facial expression stimulus (fearful/neutral) was presented after a scene or object stimulus (fearful/neutral), and event-related potentials were recorded from the onset of both the scene and the facial expression presentation. As in previous studies, during the presentation of the scenes and objects, positive-going waves at around 200-500 ms were observed for unpleasant visual stimuli at the Pz and Cz electrodes when the stimuli were intact; however, such a response was not observed when the stimuli were scrambled. During the subsequent facial expression presentation period, although we could not identify a significant interaction between the contextual information and facial expression in the N170 component, we observed a significant interaction in the P2 component: the P2 amplitude in the fearful-cued condition was significantly larger than that in the neutral-cued condition when the face was fearful, and the P2 amplitude for the neutral face was significantly larger than that for the fearful face when the preceding stimulus was neutral. These findings show that an adjacent, non-face stimulus containing emotional information influences the subsequent processing of facial expressions up to 260 ms, even when the two stimulus categories are different.


Subject(s)
Emotions/physiology , Evoked Potentials/physiology , Facial Expression , Photic Stimulation/methods , Visual Perception/physiology , Electroencephalography , Fear/psychology , Female , Humans , Male
12.
Brain Topogr ; 20(1): 31-9, 2007.
Article in English | MEDLINE | ID: mdl-17638065

ABSTRACT

We recorded event-related potentials (ERPs) to investigate in detail the interhemispheric difference of the N170 component for upright and inverted face perception in fifteen healthy subjects. This is the first ERP study to focus on interhemispheric differences in face perception by presenting faces in the hemifield. The face inversion effect, i.e., prolonged latency and enhanced amplitude, was found in both hemispheres. The peak latency of the N170 following both upright and inverted face stimulation showed no significant difference between the hemispheres, although the N170 latency for the inverted face in the left hemisphere was shorter than that in the right hemisphere. The N170 recorded from the hemisphere ipsilateral to the stimulated hemifield showed unique findings: the interhemispheric time difference of the N170 between the right and left hemispheres when the inverted face was presented in the left hemifield was significantly shorter than in the other three conditions. This unique finding may indicate that the conduction time from the right to the left hemisphere for inverted face perception is faster than in the other conditions, or that the left hemisphere processes the inverted face very rapidly after receiving signals from the right hemisphere. If the N170 is generated by at least two temporally overlapping activities, differences in how these activities summate may account for the unique findings of this study. In conclusion, by presenting face stimuli in the hemifields, we identified several new findings regarding the N170 component related to the face inversion effect.


Subject(s)
Face , Functional Laterality/physiology , Visual Perception/physiology , Adult , Data Interpretation, Statistical , Electroencephalography , Female , Humans , Male , Middle Aged , Photic Stimulation
13.
Neuroimage ; 35(4): 1624-35, 2007 May 01.
Article in English | MEDLINE | ID: mdl-17363279

ABSTRACT

We investigated whether early activity in the occipitotemporal region, corresponding to human MT/V5, is influenced by a face contour and/or features such as the mouth using magnetoencephalography (MEG). We used apparent motion as visual stimuli and compared four conditions: (1) CDL: a schematic face consisting of a face Contour, two Dots, and a horizontal Line; (2) CD: the Contour and two Dots; (3) DL: two Dots and a horizontal Line; and (4) D: two Dots only. Subjects described a simple movement of dots for D, but eye movement for CDL, DL, and CD, although the movement modality was the same across all conditions. We used a single equivalent current dipole (ECD) model within 145-220 ms after stimulus onset and estimated the location, dipole moment (strength), and peak latency. There were no significant differences in the peak latency of the estimated dipoles between conditions, but the activity was significantly stronger for CDL than for CD (p<0.05), DL (p<0.01), and D (p<0.01) in the right hemisphere, and for CDL than for DL and D (p<0.01) in the left. These results indicated that there is specific information processing for eye movements in the occipitotemporal region, the human MT/V5 homologue, and this activity was significantly influenced by whether movements appeared with the face contour and/or features, in other words, whether the eyes moved or not, even if the movement itself was the same.
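Single equivalent current dipole (ECD) fitting of the kind described, estimating one dipole's location, moment, and latency within a fixed post-stimulus window, is exposed directly by modern MEG toolboxes. A hedged sketch using MNE-Python and its freely downloadable sample dataset as stand-in data (not the authors' software, stimuli, or recordings; the 145-220 ms window mirrors the one quoted above, and the first download of the sample dataset is large):

```python
import mne

# MNE's sample dataset is used purely as stand-in data.
data_path = mne.datasets.sample.data_path()
meg_dir = data_path / "MEG" / "sample"
evoked = mne.read_evokeds(meg_dir / "sample_audvis-ave.fif",
                          condition=0, baseline=(None, 0))   # first stored evoked
cov = mne.read_cov(meg_dir / "sample_audvis-cov.fif")
bem = data_path / "subjects" / "sample" / "bem" / "sample-5120-bem-sol.fif"
trans = meg_dir / "sample_audvis_raw-trans.fif"

# Fit one ECD per time point in a 145-220 ms window, then keep the best fit.
evoked.pick("meg").crop(0.145, 0.220)
dip, residual = mne.fit_dipole(evoked, cov, bem, trans)
best = dip.gof.argmax()
print(f"peak latency {dip.times[best]*1000:.0f} ms, "
      f"moment {dip.amplitude[best]*1e9:.1f} nAm, GOF {dip.gof[best]:.1f}%")
```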


Subject(s)
Eye Movements/physiology , Face/anatomy & histology , Occipital Lobe/physiology , Temporal Lobe/physiology , Adult , Data Interpretation, Statistical , Female , Functional Laterality/physiology , Humans , Magnetic Resonance Imaging , Magnetoencephalography , Male , Middle Aged , Photic Stimulation
15.
Brain Res ; 1092(1): 152-60, 2006 May 30.
Article in English | MEDLINE | ID: mdl-16684514

ABSTRACT

The present study used magnetoencephalography (MEG) to investigate human MT/V5 activity when observing changes in eye gaze. Subjects viewed a face in which the eyes changed to look either directly at (BACK) or away from (AWAY) the subject in a series of apparent motion conditions. BACK involved 2 directions, from left to center (LC) and from right to center (RC). Likewise, AWAY involved 2 directions, from center to left (CL) and from center to right (CR). A clear MEG component, 1M, was elicited with all eye gaze changes. Mean peak latency was 157 ms and was unaffected by stimulus condition. The equivalent current dipole (ECD) was localized to human MT/V5. Two main effects were noted: (1) ECD moment was significantly larger for BACK than for AWAY; and (2) 1M ECD locations were more posterior for AWAY than for BACK. Gaze direction, with LEFT involving CL and RC and RIGHT involving CR and LC, showed no significant effects. These data indicate that MT/V5 responds to gaze direction rather than eye position, and that eye movements directed at the viewer elicit the strongest effects. Processing of gaze change is NOT sensitive to eye direction per se but rather is modulated by eye gaze relative to the viewer.


Subject(s)
Eye Movements/physiology , Motion Perception/physiology , Pattern Recognition, Visual/physiology , Temporal Lobe/physiology , Visual Cortex/physiology , Adult , Attention/physiology , Brain Mapping , Evoked Potentials/physiology , Face/physiology , Female , Fixation, Ocular/physiology , Humans , Magnetoencephalography , Male , Photic Stimulation , Reaction Time/physiology , Social Behavior , Temporal Lobe/anatomy & histology , Visual Cortex/anatomy & histology , Visual Pathways/anatomy & histology , Visual Pathways/physiology
16.
Neurosci Lett ; 402(1-2): 57-61, 2006 Jul 10.
Article in English | MEDLINE | ID: mdl-16635547

ABSTRACT

We developed a new method of color-opponent flicker (COF) stimulation and investigated behavioral responses for object discrimination at frequencies around the threshold of COF stimulation. Pairs of figures (a face, a flower, the letter "G", and a random pattern) were drawn with a red and green checkerboard with a black mesh. COF stimulation was produced by presenting the pairs of figures alternately (red-green-red-green-) at various frequencies (30-120 Hz). A discrimination task for objects during COF stimulation was performed by 16 healthy subjects. The threshold frequency of COF stimulation was between 50 and 75 Hz. The accuracy rate for face discrimination was significantly higher than that for the other objects (p<0.01, ANOVA with Fisher's PLSD multiple comparisons test). The present COF stimulation technique could be useful for investigating subliminal processes of the visual system, and the present results indicate higher sensitivity and selectivity for face discrimination than for other objects.
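Realizing a colour-opponent flicker of this kind means locking the red/green alternation to the display refresh rate: an alternation frequency can only be shown exactly if each image occupies a whole number of video frames. The helper below makes that constraint explicit; the 120 Hz refresh rate and the listed frequencies are illustrative assumptions, not the display parameters used in the study.

```python
REFRESH_HZ = 120                      # hypothetical display refresh rate

def frames_per_image(alternation_hz, refresh_hz=REFRESH_HZ):
    """Number of video frames each colour image is held for, if the
    rate at which the images swap divides the refresh rate evenly;
    otherwise None (the rate cannot be displayed exactly)."""
    frames = refresh_hz / alternation_hz
    return int(frames) if frames.is_integer() else None

for hz in (30, 40, 50, 60, 75, 100, 120):
    n = frames_per_image(hz)
    msg = f"{n} frame(s) per image" if n else "not exactly displayable at this refresh rate"
    print(f"{hz:>3} Hz alternation -> {msg}")
```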


Subject(s)
Color Perception/physiology , Discrimination, Psychological/physiology , Face , Flicker Fusion/physiology , Pattern Recognition, Visual/physiology , Subliminal Stimulation , Adult , Analysis of Variance , Female , Humans , Male , Photic Stimulation/methods , Sensory Thresholds/physiology
17.
Hum Brain Mapp ; 27(10): 811-8, 2006 Oct.
Article in English | MEDLINE | ID: mdl-16511887

ABSTRACT

We investigated the effects of subliminal stimulation on visible stimulation to demonstrate the priority of facial discrimination processing, using a unique, indiscernible, color-opponent subliminal (COS) stimulation. We recorded event-related magnetic cortical fields (ERF) by magnetoencephalography (MEG) after the presentation of a face or flower stimulus with COS conditioning using a face, flower, random pattern, and blank. The COS stimulation enhanced the response to visible stimulation when the figure in the COS stimulation was identical to the target visible stimulus, but more so for the face than for the flower stimulus. The ERF component modulated by the COS stimulation was estimated to be located in the ventral temporal cortex. We speculated that the enhancement was caused by an interaction of the responses after subthreshold stimulation by the COS stimulation and the suprathreshold stimulation after target stimulation, such as in the processing for categorization or discrimination. We also speculated that the face was processed with priority at the level of the ventral temporal cortex during visual processing outside of consciousness.


Subject(s)
Brain Mapping , Pattern Recognition, Visual/physiology , Subliminal Stimulation , Temporal Lobe/physiology , Adult , Color , Facial Expression , Female , Humans , Magnetoencephalography , Male , Photic Stimulation
18.
Neuroimage ; 30(1): 239-44, 2006 Mar.
Article in English | MEDLINE | ID: mdl-16310379

ABSTRACT

To clarify the latency of the earliest cortical activity in visual processing, electroretinograms (ERGs) and visual evoked magnetic fields (VEFs) following flash stimulation were recorded simultaneously in six human subjects. Flash stimuli were applied to the right eye, and ERGs were recorded from a skin electrode placed on the lower lid. ERGs showed two major deflections in all subjects: an eyelid-negativity around 20 ms and a positivity around 60 ms, corresponding to the a- and b-waves, respectively. The mean onset and peak latencies of the earliest VEF component (37M) were 30.2 and 36.9 ms, respectively. There was a linear correlation between the peak latency of the a-wave and the onset latency of the 37M (r=0.90, P=0.011). When a single equivalent current dipole analysis was applied to the 37M, four out of six subjects showed highly reliable results. The generator of the 37M was estimated to be located in the striate cortex in all four subjects. Since post-receptoral activities in the retina are expected to start around the peak of the a-wave (20 ms), the early cortical activity, which appears about 10 ms later than the a-wave peak, is considered to be the earliest cortical activity following flash stimulation.
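The reported relationship between a-wave peak latency and 37M onset latency (r=0.90, P=0.011, n=6) is a simple Pearson correlation. With invented latency values standing in for the real per-subject data, the computation looks like this:

```python
import numpy as np
from scipy.stats import pearsonr

# Invented per-subject latencies (ms) for illustration only; the paper's
# actual values are not reproduced here.
a_wave_peak = np.array([19.0, 20.5, 21.0, 19.5, 22.0, 20.0])
m37_onset   = np.array([29.5, 30.8, 31.5, 29.8, 32.0, 30.0])

r, p = pearsonr(a_wave_peak, m37_onset)
print(f"Pearson r = {r:.2f}, p = {p:.3f} (n = {a_wave_peak.size})")
```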


Subject(s)
Electroencephalography/statistics & numerical data , Evoked Potentials, Visual/physiology , Magnetoencephalography/statistics & numerical data , Signal Processing, Computer-Assisted , Visual Cortex/physiology , Visual Perception/physiology , Adult , Brain Mapping , Dominance, Cerebral/physiology , Electroretinography/statistics & numerical data , Humans , Male , Photic Stimulation , Reaction Time/physiology , Reference Values , Statistics as Topic , Time Factors
19.
Cereb Cortex ; 16(1): 18-30, 2006 Jan.
Article in English | MEDLINE | ID: mdl-15800024

ABSTRACT

Although anatomical, histochemical and electrophysiological findings in both animals and humans have suggested a parallel and serial mode of auditory processing, precise activation timings of each cortical area are not well known, especially in humans. We investigated the timing of arrival of signals to multiple cortical areas using magnetoencephalography in humans. Following click stimuli applied to the left ear, activations were found in six cortical areas in the right hemisphere: the posteromedial part of Heschl's gyrus (HG) corresponding to the primary auditory cortex (PAC), the anterolateral part of the HG region on or posterior to the transverse sulcus, the posterior parietal cortex (PPC), posterior and anterior parts of the superior temporal gyrus (STG), and the planum temporale (PT). The mean onset latencies of each cortical activity were 17.1, 21.2, 25.3, 26.2, 30.9 and 47.6 ms respectively. These results suggested a serial model of auditory processing along the medio-lateral axis of the supratemporal plane and, in addition, implied the existence of several parallel streams running postero-superiorly (from the PAC to the belt region and then to the posterior STG, PPC or PT) and anteriorly (PAC-belt-anterior STG).
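The abstract does not state how onset latencies were defined; a common convention is to take the first post-stimulus time at which the source waveform exceeds a multiple of the baseline noise for several consecutive samples. The sketch below implements that convention on simulated data; the threshold, run length, and waveform are assumptions, not the study's criterion.

```python
import numpy as np

def onset_latency(waveform, times, baseline_end=0.0, k=2.5, min_run=5):
    """First post-stimulus time at which |waveform| exceeds k * baseline SD
    for at least min_run consecutive samples. A common convention, not
    necessarily the criterion used in the study above."""
    base_sd = waveform[times < baseline_end].std()
    above = np.abs(waveform) > k * base_sd
    above[times < baseline_end] = False
    run = 0
    for i, flag in enumerate(above):
        run = run + 1 if flag else 0
        if run == min_run:
            return times[i - min_run + 1]
    return None

fs = 2000                                        # high sampling rate for early latencies
times = np.arange(-0.05, 0.10, 1.0 / fs)
source = np.random.randn(times.size)             # stand-in for a source waveform
source[times > 0.017] += 6.0                     # fake activation starting ~17 ms
print(f"estimated onset: {onset_latency(source, times) * 1000:.1f} ms")
```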


Subject(s)
Auditory Cortex/physiology , Brain Mapping/methods , Evoked Potentials, Auditory/physiology , Magnetoencephalography/methods , Nerve Net/physiology , Reaction Time/physiology , Adult , Auditory Pathways/physiology , Female , Humans , Male , Middle Aged
20.
Neuropathology ; 25(1): 8-20, 2005 Mar.
Article in English | MEDLINE | ID: mdl-15822814

ABSTRACT

We have been studying the underlying mechanisms of face perception in humans using magneto- (MEG) and electro-encephalography (EEG) including (1) perception by viewing the static face, (2) differences in perception by viewing the eyes and whole face, (3) the face inversion effect, (4) the effect of gaze direction, (5) perception of eye motion, (6) perception of mouth motion, and (7) the interaction between auditory and visual stimuli related to the vowel sounds. In this review article, we mainly summarize our results obtained on 3, 5, and 6 above. With the presentation of both upright and inverted unfamiliar faces, the inferior temporal cortex (IT) centered on the fusiform gyrus, and the lateral temporal cortex (LT) near the superior temporal sulcus were activated simultaneously, but independently, between 140 and 200 ms post-stimulus. The right hemisphere IT and LT were both active in all subjects, and those in the left hemisphere in half of the subjects. Latencies with inverted faces relative to those with upright faces were longer in the right hemisphere, and shorter in the left hemisphere. Since the activated regions under upright and those under inverted face stimuli did not show a significant difference, we consider that differences in processing upright versus inverted faces are attributable to temporal processing differences rather than to processing of information by different brain regions. When viewing the motion of the mouth and eyes, a large clear MEG component, 1M (mean peak latency of approximately 160 ms), was elicited to both mouth and eye movement, and was generated mainly in the occipito-temporal border, at human MT/V5. The 1M to mouth movement and the 1M to eye movement showed no significant difference in amplitude or generator location. Therefore, our results indicate that human MT/V5 is active in the perception of both mouth and eye motion, and that the perception of movement of facial parts is probably processed similarly.


Subject(s)
Brain Mapping , Face , Visual Perception/physiology , Adult , Electroencephalography , Facial Expression , Female , Humans , Magnetoencephalography , Male , Photic Stimulation