2.
Front Psychol ; 13: 915550, 2022.
Article in English | MEDLINE | ID: mdl-35910971

ABSTRACT

Emojis are universal tools frequently used to express people's emotional states in daily communication. They are often applied in various fields of research, such as consumer surveys, as indicators of users' emotional states. Further analysis of how people of different ages interpret emojis is required to ensure the validity of emojis as a metric in such fields, thereby reducing misunderstandings. However, details regarding the effect of age on both arousal and valence, as they pertain to the interpretation of emojis, remain unclear. Therefore, in this study, we investigated the effect of age on the interpretation of facial emojis in the arousal-valence space. We conducted an online survey involving 2,000 participants, in which we employed a nine-point scale to evaluate the valence and arousal levels associated with 74 facial emojis. Based on the two axes of valence and arousal, the emojis were categorized into six similar clusters across the age groups involved in this study. For the two negative clusters, i.e., strongly negative and moderately negative sentiments, the middle-aged group showed significantly higher arousal levels than the young group. However, not all emojis classified into these negative clusters showed this age difference. Based on these results, we recommend using emojis without age-related effects in the negative clusters as indices for evaluating human emotions.

3.
Sci Rep ; 12(1): 398, 2022 01 27.
Article in English | MEDLINE | ID: mdl-35087076

ABSTRACT

Emojis are frequently used by people worldwide as a tool to express their emotional states and have recently been considered for use in research assessments. However, details regarding the ways in which they correspond to human emotional states remain unidentified. Thus, this study aimed to understand how emojis are classified on the valence and arousal axes and to examine the relationship between emojis and human emotional states. In an online survey involving 1082 participants, a nine-point scale was employed to evaluate the valence and arousal levels of 74 facial emojis. Results from the cluster analysis revealed that these emojis are categorized into six different clusters on the two axes of valence and arousal. Further, a one-way analysis of variance indicated that these clusters have six valence and four arousal levels. From the results, the clusters were interpreted as (1) a strongly negative sentiment, (2) a moderately negative sentiment, (3) a neutral sentiment with a negative bias, (4) a neutral sentiment with a positive bias, (5) a moderately positive sentiment, and (6) a strongly positive sentiment. Therefore, facial emojis were found to express human emotional states comprehensively.


Subject(s)
Emotions , Facial Expression , Social Media , Adult , Arousal , Female , Humans , Internet , Male , Surveys and Questionnaires , Young Adult
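The cluster analysis described above (mean valence and arousal ratings for 74 emojis grouped into six clusters) can be sketched roughly as follows. The data are randomly generated placeholders rather than the study's ratings, and k-means is an assumed choice of algorithm; the abstract does not specify the clustering method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder data: mean (valence, arousal) ratings on a nine-point
# scale for 74 hypothetical emojis -- NOT the study's actual ratings.
ratings = rng.uniform(1, 9, size=(74, 2))

def kmeans(points, k, iters=100, seed=0):
    """Minimal k-means: returns cluster labels and centroids."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        dists = np.linalg.norm(points[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute centroids; keep the old one if a cluster empties.
        new = np.array([points[labels == j].mean(axis=0)
                        if np.any(labels == j) else centroids[j]
                        for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

labels, centroids = kmeans(ratings, k=6)
# Order clusters by mean valence, mirroring the strongly-negative to
# strongly-positive interpretation given in the abstract.
order = centroids[:, 0].argsort()
```

The per-cluster arousal levels reported in the abstract would then correspond to comparing `centroids[:, 1]` across the ordered clusters.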
4.
PLoS One ; 16(1): e0245000, 2021.
Article in English | MEDLINE | ID: mdl-33406157

ABSTRACT

This study investigated the effect of active hand movement on the perception of 3-D depth change. In Experiment 1, the 3-D height of an object changed synchronously with the participant's hand movement, but the 3-D height of the object was incongruent with the distance moved by the hand. The results showed no effect of active hand movement on perceived depth. This was inconsistent with the results of a previous study conducted in a similar setting with passive hand movement. It was speculated that this contradiction arose because the conflict between the distance moved by the hand and the visual depth change was more easily detected in the active movement situation. Therefore, it was assumed that in a condition where this conflict was hard to detect, active hand movement might affect visual depth perception. To examine this hypothesis, Experiment 2 tested whether information from hand movement would resolve the ambiguity in the depth direction of a shaded visual shape. In this experiment, the distance moved by the hand could (logically) accord with either of two depth directions (concave or convex). Moreover, the discrepancy between the visually and haptically perceived distances could remain ambiguous because shading cues are unreliable for estimating absolute depth. The results showed that perceived depth directions were affected by the direction of active hand movement, supporting the hypothesis. Based on these results, simulations based on a causal inference model were performed, and these simulations replicated the qualitative aspects of the experimental results.


Subject(s)
Cues , Depth Perception/physiology , Movement/physiology , Visual Perception/physiology , Adult , Female , Hand , Humans , Male , Young Adult
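The causal inference simulations mentioned above follow a standard Bayesian pattern: compute the probability that the visual and haptic cues share a common cause, then average the fused and single-cue estimates by that probability. A minimal sketch, with all noise parameters and the function name chosen for illustration (the paper's actual model parameters are not given in the abstract):

```python
import numpy as np

def gauss(x, mu, var):
    """Gaussian density with mean mu and variance var."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def causal_inference_depth(x_v, x_h, s_v=1.0, s_h=1.0, s_p=10.0, p_c=0.5):
    """Return (p(common cause | cues), model-averaged visual depth estimate).
    x_v, x_h: noisy visual and haptic depth measurements; s_v, s_h: cue
    noise SDs; s_p: SD of a zero-mean prior over depth; p_c: prior p(C=1).
    Illustrative values only."""
    v_v, v_h, v_p = s_v**2, s_h**2, s_p**2
    # Likelihood of both cues given one common cause (cause integrated out).
    denom = v_v * v_h + v_v * v_p + v_h * v_p
    like_c1 = np.exp(-0.5 * ((x_v - x_h)**2 * v_p + x_v**2 * v_h
                             + x_h**2 * v_v) / denom) / (2 * np.pi * np.sqrt(denom))
    # Likelihood given two independent causes.
    like_c2 = gauss(x_v, 0.0, v_v + v_p) * gauss(x_h, 0.0, v_h + v_p)
    post_c1 = like_c1 * p_c / (like_c1 * p_c + like_c2 * (1 - p_c))
    # Optimal estimates under each causal structure, then model averaging.
    fused = (x_v / v_v + x_h / v_h) / (1 / v_v + 1 / v_h + 1 / v_p)
    visual_only = (x_v / v_v) / (1 / v_v + 1 / v_p)
    return post_c1, post_c1 * fused + (1 - post_c1) * visual_only
```

With consistent cues (e.g., `causal_inference_depth(2.0, 2.0)`) the common-cause posterior is high and the haptic cue pulls the depth estimate; with discrepant cues it falls and the estimate reverts toward vision alone, qualitatively matching the direction-dependent integration described in the abstract.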
5.
Cogn Emot ; 32(8): 1663-1670, 2018 12.
Article in English | MEDLINE | ID: mdl-29334821

ABSTRACT

The present study describes the development and validation of a facial expression database comprising five different horizontal face angles in dynamic and static presentations. The database includes twelve expression types portrayed by eight Japanese models. This database was inspired by the dimensional and categorical model of emotions: surprise, fear, sadness, anger with open mouth, anger with closed mouth, disgust with open mouth, disgust with closed mouth, excitement, happiness, relaxation, sleepiness, and neutral (static only). The expressions were validated using emotion classification and Affect Grid rating tasks [Russell, Weiss, & Mendelsohn, 1989. Affect Grid: A single-item scale of pleasure and arousal. Journal of Personality and Social Psychology, 57(3), 493-502]. The results indicate that most of the expressions were recognised as the intended emotions and could systematically represent affective valence and arousal. Furthermore, face angle and facial motion information influenced emotion classification and valence and arousal ratings. Our database will be available online at the following URL: https://www.dh.aist.go.jp/database/face2017/.


Subject(s)
Emotions/physiology , Facial Expression , Adult , Anger , Arousal , Databases, Factual , Face , Fear , Female , Happiness , Humans , Japan , Male , Personality , Pleasure , Reproducibility of Results , Students/psychology , Young Adult
6.
Front Psychol ; 8: 314, 2017.
Article in English | MEDLINE | ID: mdl-28326051

ABSTRACT

The effect of perceived causality on other aspects of perception, such as temporal or spatial perception, has interested many researchers. Previous studies have shown that the perceived timing of two events is modulated when the events are intentionally produced or when the causal link between the two events is known in advance. However, little research has directly supported the idea that causality alone can modulate the perceived timing of two events without advance knowledge of the causal link. In this study, I used novel causal displays in which various types of causal contexts could be presented in the subsequent events (movement or color change of objects). In these displays, the preceding event was always the same (a ball falling from above), so observers could not predict which subsequent event would be displayed. The results showed that the perceived causal context modulated the temporal relationship of the two serial events so as to be consistent with the causal order implied by the subsequent event (the ball hit the floor, then the objects moved). These modulations were smaller when the movements implied a preceding effect of the falling ball (e.g., wind pressure). These results fit well within the Bayesian framework, in which the perceived timing of events is reconstructed through observers' prior experiences, and suggest that multiple prior experiences competitively contribute to the estimation of event timing.

7.
Front Psychol ; 6: 1302, 2015.
Article in English | MEDLINE | ID: mdl-26388807

ABSTRACT

The Simon effect is a phenomenon in which reaction times are usually faster when the stimulus location and the response location correspond, even if the stimulus location is irrelevant to the task. Recent studies have demonstrated the Simon effect in a three-dimensional (3-D) display. The present study examined whether two-dimensional (2-D) and 3-D locations simultaneously affected the Simon effect for stimuli in which a target and fixation were located on the same plane (ground or ceiling) at different 3-D depths, and the perspective effect produced a difference in the 2-D vertical location of the target stimulus relative to the fixation. The presence of the ground and ceiling planes was controlled to examine the contextual effects of the background. The results showed that 2-D vertical location and 3-D depth simultaneously affected the speed of responses, and they did not interact. The presence of the background did not affect the magnitude of either the 2-D or the 3-D Simon effect. These results suggest that 2-D vertical location and 3-D depth are coded simultaneously and independently, and both affect response selection, where the 2-D and 3-D representations overlap.

8.
PLoS One ; 9(9): e106633, 2014.
Article in English | MEDLINE | ID: mdl-25180594

ABSTRACT

This study examined the effects of hand movement on the visual perception of 3-D movement. I used an apparatus in which the cursor position in a simulated 3-D space and the position of a stylus on a haptic device could coincide via a mirror. In three experiments, participants touched the center of a rectangle in the visual display with the stylus of the force-feedback device. The rectangle's surface then stereoscopically either protruded toward the participant or was indented away from the participant. Simultaneously, the stylus either pushed the participant's hand back, pulled it away, or remained static. Visual and haptic information were manipulated independently. Participants judged whether the rectangle visually protruded or was indented. Results showed that when the hand was pulled away, participants were biased to perceive the rectangles as indented; however, when the hand was pushed back, no effect of haptic information was observed (Experiment 1). This effect persisted even when the cursor position was spatially separated from the hand position (Experiment 2). However, when participants touched an object different from the visual stimulus, the effect disappeared (Experiment 3). These results suggest that the visual system tries to integrate dynamic visual and haptic information when they cognitively coincide, and that the effect of haptic information on visually perceived depth is direction-dependent.


Subject(s)
Depth Perception/physiology , Movement , Proprioception , Adult , Female , Hand/physiology , Humans , Male , Middle Aged , Probability , Young Adult
9.
Vision Res ; 49(8): 834-42, 2009 Mar.
Article in English | MEDLINE | ID: mdl-19285522

ABSTRACT

We investigated whether the visual system could use a novel action-perception relationship, mediated by a touch panel, to resolve ambiguity in 2D optic flows. The stimulus was an optic flow produced by a dotted plane that was translated and rotated in depth. The translation was synchronized with the subject's hand movements on a touch panel. There were two perceptual interpretations of the stimulus as a surface patch oriented in 3D: (1) approaching in depth while rotating away from gaze normal, or (2) stationary in depth while rotating toward gaze normal around an axis perpendicular to that of Case 1 [Wexler, M., Lamouret, I., & Droulez, J. (2001a). The stationarity hypothesis: an allocentric criterion in visual perception. Vision Research, 41, 3023-3037]. Subjects reported the direction of the axis of rotation, which was perceptually coupled with the perception of translation in depth. The results indicate that the frequency of the Case 1 percept increased as the sessions progressed. This suggests that the visual system learned the association between hand movements and viewpoint translation during the experiment and used this association to decompose the optic flow.


Subject(s)
Hand/physiology , Motion Perception/physiology , Movement/physiology , Psychomotor Performance/physiology , Adult , Female , Humans , Male , Photic Stimulation/methods , Young Adult
10.
Perception ; 37(11): 1649-66, 2008.
Article in English | MEDLINE | ID: mdl-19189730

ABSTRACT

The perceived temporal order of external successive events does not always follow their physical temporal order. We examined the contribution of self-motion mechanisms in the perception of temporal order in the auditory modality. We measured perceptual biases in the judgment of the temporal order of two short sounds presented successively, while participants experienced visually induced self-motion (yaw-axis circular vection) elicited by viewing long-lasting large-field visual motion. In experiment 1, a pair of white-noise patterns was presented to participants at various stimulus-onset asynchronies through headphones, while they experienced visually induced self-motion. Perceived temporal order of auditory events was modulated by the direction of the visual motion (or self-motion). Specifically, the sound presented to the ear in the direction opposite to the visual motion (ie heading direction) was perceived prior to the sound presented to the ear in the same direction. Experiments 2A and 2B were designed to reduce the contributions of decisional and/or response processes. In experiment 2A, the directional cueing of the background (left or right) and the response dimension (high pitch or low pitch) were not spatially associated. In experiment 2B, participants were additionally asked to report which of the two sounds was perceived 'second'. Almost the same results as in experiment 1 were observed, suggesting that the change in temporal order of auditory events during large-field visual motion reflects a change in perceptual processing. Experiment 3 showed that the biases in the temporal-order judgments of auditory events were caused by concurrent actual self-motion with a rotatory chair. In experiment 4, using a small display, we showed that 'pure' long exposure to visual motion without the sensation of self-motion was not responsible for this phenomenon. 
These results are consistent with previous studies reporting a change in the perceived temporal order of visual or tactile events depending on the direction of self-motion. Hence, self-motion induced by large-field visual motion (ie optic flow) can affect the perceived temporal order of successive external events across various modalities.


Subject(s)
Auditory Perception/physiology , Judgment/physiology , Motion Perception/physiology , Acoustic Stimulation/methods , Adult , Cues , Female , Humans , Male , Photic Stimulation/methods , Psychophysics , Time Perception/physiology , User-Computer Interface , Young Adult
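Biases in temporal-order judgments like those above are typically quantified as a shift in the point of subjective simultaneity (PSS), estimated by fitting a psychometric function to the proportion of one response across stimulus-onset asynchronies. A minimal sketch with synthetic data and an assumed logistic form (the study's actual analysis procedure is not specified in the abstract):

```python
import numpy as np

def psychometric(soa, pss, slope):
    """P(judging one designated sound 'first') at a given SOA (ms);
    pss is the 50% point, slope controls the steepness."""
    return 1.0 / (1.0 + np.exp(-(soa - pss) / slope))

def fit_pss(soas, props):
    """Grid-search least-squares fit of (pss, slope); returns the best PSS."""
    best, best_err = 0.0, np.inf
    for pss in np.linspace(-100, 100, 401):
        for slope in np.linspace(5, 80, 76):
            err = np.sum((psychometric(soas, pss, slope) - props) ** 2)
            if err < best_err:
                best, best_err = pss, err
    return best

# Synthetic data: a 20 ms shift of the point of subjective simultaneity,
# as might result from induced self-motion (illustrative numbers only).
soas = np.array([-120, -80, -40, -20, 0, 20, 40, 80, 120], float)
props = psychometric(soas, 20.0, 30.0)
print(fit_pss(soas, props))
```

A nonzero fitted PSS means one sound must physically lead for the two to be judged simultaneous, which is the form the directional bias in the experiments above would take.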
11.
Perception ; 36(8): 1229-43, 2007.
Article in English | MEDLINE | ID: mdl-17972485

ABSTRACT

We examined whether the position of objects in external space affects the visual-search task associated with the tilt of 3-D objects. An array of cube-like objects was stereoscopically displayed at a distance of 4.5 m on a large screen 1.5 m above or below eye height. Subjects were required to detect a downward-tilted target among upward-tilted distractors or an upward-tilted target among downward-tilted distractors. When the stimuli consisted of shaded parallelepipeds whose upper/bottom faces were lighter than their side faces, the upward-tilted target was detected faster. This result was in accordance with the 'top-view assumption' reported in previous research. Displaying stimuli in the upper position degraded overall performance. However, when the shaded objects whose upper/bottom faces were darker than their side faces were displayed, the detection of a downward-tilted target became as efficient as that of an upward-tilted target only at the upper position. These results indicate that it is possible for the spatial position of the stimulus to promote the detection of a downward-tilted target when shading and perspective information are consistent with the viewing direction.


Subject(s)
Attention , Depth Perception/physiology , Discrimination, Psychological , Eye Movements/physiology , Form Perception/physiology , Adult , Analysis of Variance , Humans , Psychophysics , Reaction Time
12.
J Neuroeng Rehabil ; 4: 36, 2007 Sep 29.
Article in English | MEDLINE | ID: mdl-17903267

ABSTRACT

OBJECTIVE: We studied the effects on heart rate variability of presenting a visual sign that warned subjects of acceleration around the yaw and pitch axes in virtual reality (VR). METHODS: Synchronization of the immersive virtual reality equipment (CAVE) and a motion base system generated a driving scene and provided subjects with dynamic, wide-ranging depth information and vestibular input. The heart rate variability of 21 subjects was measured while they observed a simulated driving scene for 16 minutes under three different conditions. RESULTS: When the predictive sign appeared 3500 ms before the acceleration, the subjects' index of autonomic nervous system activity (the low-frequency/high-frequency ratio; LF/HF ratio) changed little, whereas when no sign appeared, the LF/HF ratio increased over the observation time. When the predictive sign appeared 750 ms before the acceleration, no systematic change occurred. CONCLUSION: The visual sign informing subjects of the upcoming acceleration affected autonomic nervous system activity when it appeared sufficiently long before the acceleration. Our results also show the importance of the interval between the sign and the event, and of the relationship between how gradually events are represented and their quantity.


Subject(s)
Acceleration , Heart Rate/physiology , Motion Perception/physiology , Movement/physiology , Reaction Time/physiology , Reflex, Vestibulo-Ocular/physiology , User-Computer Interface , Adult , Evidence-Based Medicine , Female , Humans , Male , Pilot Projects
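The LF/HF ratio used above is conventionally computed from the series of RR intervals: resample the tachogram evenly, take a power spectrum, and integrate the low-frequency (0.04-0.15 Hz) and high-frequency (0.15-0.40 Hz) bands. A rough numpy-only sketch with synthetic data; the band edges are the conventional ones, and the paper's exact preprocessing is not given in the abstract:

```python
import numpy as np

def lf_hf_ratio(rr_ms, fs=4.0):
    """Estimate the LF/HF ratio from successive RR intervals (ms).
    Resamples the RR tachogram at fs Hz, removes the mean, and
    integrates periodogram power in the LF (0.04-0.15 Hz) and
    HF (0.15-0.40 Hz) bands."""
    rr = np.asarray(rr_ms, float)
    t = np.cumsum(rr) / 1000.0                    # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs)       # even time grid
    x = np.interp(grid, t, rr) - np.interp(grid, t, rr).mean()
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    lf = psd[(freqs >= 0.04) & (freqs < 0.15)].sum()
    hf = psd[(freqs >= 0.15) & (freqs < 0.40)].sum()
    return lf / hf

# Synthetic tachogram: 800 ms baseline with a strong 0.25 Hz oscillation
# (respiratory/HF band), so the resulting ratio should be well below 1.
t_beat = np.arange(0, 300, 0.8)                   # rough beat times (s)
rr = 800 + 50 * np.sin(2 * np.pi * 0.25 * t_beat)
print(lf_hf_ratio(rr))
```

A rising LF/HF ratio over time, as in the no-sign condition above, is commonly read as a shift toward sympathetic dominance.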