1.
J Vis ; 24(5): 16, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38819806

ABSTRACT

Multistable perception occurs in all sensory modalities, and there is ongoing theoretical debate about whether there are overarching mechanisms driving multistability across modalities. Here we study whether multistable percepts are coupled across vision and audition on a moment-by-moment basis. To assess perception simultaneously for both modalities without provoking a dual-task situation, we query auditory perception by direct report, while measuring visual perception indirectly via eye movements. A support-vector-machine (SVM)-based classifier allows us to decode visual perception from the eye-tracking data on a moment-by-moment basis. For each timepoint, we compare the visual percept (SVM output) and the auditory percept (report) and quantify the co-occurrence of integrated (one-object) or segregated (two-object) interpretations in the two modalities. Our results show an above-chance coupling of auditory and visual perceptual interpretations. By titrating stimulus parameters toward an approximately symmetric distribution of integrated and segregated percepts for each modality and individual, we minimize the amount of coupling expected by chance. Because of the nature of our task, we can rule out that the coupling stems from postperceptual levels (i.e., decision or response interference). Our results thus indicate moment-by-moment perceptual coupling in the resolution of visual and auditory multistability, lending support to theories that postulate joint mechanisms for multistable perception across the senses.
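The SVM-based decoding step described above can be sketched as follows. This is a minimal illustration only: the feature names (e.g., OKN slow-phase velocity, gaze dispersion), the synthetic data, and the classifier settings are assumptions for the sketch, not the authors' actual pipeline.

```python
# Sketch: decoding a binary percept (integrated vs. segregated) from
# eye-tracking features with an SVM. All data here are synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
# Two assumed features per time window, e.g. OKN slow-phase velocity
# and gaze dispersion; one cluster per perceptual interpretation.
X_integrated = rng.normal(loc=[1.0, 0.5], scale=0.5, size=(n, 2))
X_segregated = rng.normal(loc=[-1.0, -0.5], scale=0.5, size=(n, 2))
X = np.vstack([X_integrated, X_segregated])
y = np.array([1] * n + [0] * n)  # 1 = integrated, 0 = segregated

clf = SVC(kernel="rbf")
scores = cross_val_score(clf, X, y, cv=5)  # decoding accuracy per fold
print(scores.mean())  # should sit well above the 0.5 chance level here
```

In a real application the classifier would be trained on windows with unambiguous (reported or replayed) percepts and then applied to held-out time points, giving a moment-by-moment visual-percept estimate to compare against the auditory report.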


Subject(s)
Auditory Perception; Photic Stimulation; Visual Perception; Humans; Auditory Perception/physiology; Visual Perception/physiology; Adult; Male; Female; Photic Stimulation/methods; Young Adult; Eye Movements/physiology; Acoustic Stimulation/methods
2.
J Cult Cogn Sci ; 6(3): 251-268, 2022.
Article in English | MEDLINE | ID: mdl-35996660

ABSTRACT

This study investigated the universality of emotional prosody in the perception of discrete emotions when semantics is not available. In two experiments, the perception of emotional prosody in Hebrew and German was investigated with listeners who speak one of the languages but not the other. Having a parallel tool in both languages allowed us to conduct controlled comparisons. In Experiment 1, 39 native German speakers with no knowledge of Hebrew and 80 native Israeli speakers rated Hebrew sentences spoken with four different emotional prosodies (anger, fear, happiness, sadness) or neutrally. The Hebrew version of the Test for Rating of Emotions in Speech (T-RES) was used for this purpose. Ratings indicated participants' agreement on how much each sentence conveyed each of four discrete emotions (anger, fear, happiness, and sadness). In Experiment 2, 30 native speakers of German and 24 native Israeli speakers of Hebrew who had no knowledge of German rated sentences of the German version of the T-RES. Based only on the prosody, German-speaking participants were able to accurately identify the emotions in the Hebrew sentences, and Hebrew-speaking participants were able to identify the emotions in the German sentences. In both experiments, ratings between the groups were similar. These findings show that individuals are able to identify emotions in a foreign language even if they do not have access to semantics. This ability goes beyond identification of the target emotion; similarities between languages exist even for "wrong" perceptions. This adds to accumulating evidence in the literature on the universality of emotional prosody. Supplementary Information: The online version contains supplementary material available at 10.1007/s41809-022-00107-x.
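The claim that "ratings between the groups were similar" can be quantified by correlating the two groups' mean ratings over sentences. A minimal sketch, with entirely synthetic ratings (the latent-intensity model and noise levels are assumptions, not the T-RES data):

```python
# Sketch: cross-group agreement as the correlation of per-sentence mean
# ratings between native and non-native listener groups. Synthetic data.
import numpy as np

rng = np.random.default_rng(1)
n_sentences = 40
true_intensity = rng.uniform(1, 6, n_sentences)          # latent conveyed emotion
native = true_intensity + rng.normal(0, 0.5, n_sentences)    # native listeners
nonnative = true_intensity + rng.normal(0, 0.7, n_sentences) # foreign listeners

r = np.corrcoef(native, nonnative)[0, 1]  # Pearson correlation between groups
print(round(r, 2))
```

A high correlation indicates that both groups rank the sentences similarly on a given emotion scale, including shared "wrong" attributions, which is the pattern the abstract reports.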

3.
PLoS One ; 16(6): e0252370, 2021.
Article in English | MEDLINE | ID: mdl-34086770

ABSTRACT

In multistability, a constant stimulus induces alternating perceptual interpretations. For many forms of visual multistability, the transition from one interpretation to another ("perceptual switch") is accompanied by a dilation of the pupil. Here we ask whether the same holds for auditory multistability, specifically auditory streaming. Two tones were played in alternation, yielding four distinct interpretations: the tones can be perceived as one integrated percept (a single sound source), or as segregated with either tone or both tones in the foreground. We found that the pupil dilates significantly around the time a perceptual switch is reported ("multistable condition"). When participants instead responded to actual stimulus changes that closely mimicked the multistable perceptual experience ("replay condition"), the pupil dilated more around such responses than in multistability. This still held when data were corrected for the pupil response to the stimulus change as such. Hence, active responses to an exogenous stimulus change trigger a stronger or temporally more confined pupil dilation than responses to an endogenous perceptual switch. In another condition, participants randomly pressed the buttons used for reporting multistability. In Study 1, this "random condition" failed to sufficiently mimic the temporal pattern of multistability. By adapting the instructions, in Study 2 we obtained a response pattern more similar to the multistable condition. In this case, the pupil dilated significantly around the random button presses. Albeit numerically smaller, this pupil response was not significantly different from that in the multistable condition. While there are several possible explanations (related, e.g., to the decision to respond), this underlines the difficulty of isolating a purely perceptual effect in multistability. Our data extend previous findings from visual to auditory multistability. They highlight methodological challenges in interpreting such data and suggest possible approaches to meet them, including a novel stimulus to simulate the experience of perceptual switches in auditory streaming.
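The core analysis, an event-locked average of the pupil trace around reported switches, can be sketched as below. The sampling rate, epoch window, dilation shape, and switch times are assumed illustrative values, not the study's parameters.

```python
# Sketch: epoching a pupil-diameter trace around switch reports, with
# baseline correction on the pre-event interval. All data are synthetic.
import numpy as np

fs = 60                                   # assumed sampling rate (Hz)
dur = 300.0                               # seconds of simulated recording
t = np.arange(0, dur, 1 / fs)
switches = np.arange(20.0, 280.0, 15.0)   # assumed switch-report times (s)

rng = np.random.default_rng(2)
pupil = rng.normal(0, 0.05, t.size)       # baseline noise
for s in switches:                        # add a dilation peaking ~0.5 s after each report
    pupil += 0.3 * np.exp(-((t - s - 0.5) ** 2) / (2 * 0.4 ** 2))

pre, post = 1.0, 2.0                      # epoch window around each report (s)
epochs = []
for s in switches:
    i = int(s * fs)
    seg = pupil[i - int(pre * fs): i + int(post * fs)].copy()
    seg -= seg[: int(pre * fs)].mean()    # baseline-correct on pre-event interval
    epochs.append(seg)
mean_epoch = np.mean(epochs, axis=0)      # event-locked pupil response
print(mean_epoch.max())                   # peak dilation after the report
```

Comparing such event-locked averages across the multistable, replay, and random conditions (after correcting the replay condition for the stimulus-evoked component) is the kind of contrast the abstract describes.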


Subject(s)
Auditory Perception/physiology; Acoustic Stimulation/methods; Adult; Female; Humans; Male; Pupil/physiology; Sound; Visual Perception/physiology
4.
Vision Res ; 182: 69-88, 2021 05.
Article in English | MEDLINE | ID: mdl-33610002

ABSTRACT

In multistability, perceptual interpretations ("percepts") of ambiguous stimuli alternate over time. There is considerable debate as to whether similar regularities govern the first percept after stimulus onset and percepts during prolonged presentation. We address this question in a visual pattern-component rivalry paradigm by presenting two overlaid drifting gratings, which participants perceived as individual gratings passing in front of each other ("segregated") or as a plaid ("integrated"). We varied the enclosed angle ("opening angle") between the gratings (experiments 1 and 2) and stimulus orientation (experiment 2). The relative number of integrated percepts increased monotonically with opening angle. The point of equality, where half of the percepts were integrated, was at a smaller opening angle at onset than during prolonged viewing. The functional dependence of the relative number of integrated percepts on opening angle showed a steeper curve at onset than during prolonged viewing. Dominance durations of integrated percepts were longer at onset than during prolonged viewing and increased with opening angle. The general pattern persisted when stimuli were rotated (experiment 2), despite some perceptual preference for cardinal motion directions over oblique directions. Analysis of eye movements, specifically the slow phase of the optokinetic nystagmus (OKN), confirmed the veridicality of participants' reports and provided a temporal characterization of percept formation after stimulus onset. Together, our results show that the first percept after stimulus onset exhibits a different dependence on stimulus parameters than percepts during prolonged viewing. This underlines the distinct role of the first percept in multistability.
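The "point of equality" reported above is naturally estimated by fitting a psychometric function to the proportion of integrated percepts as a function of opening angle. A minimal sketch with a logistic fit; the angles and proportions are illustrative, not the reported data:

```python
# Sketch: logistic psychometric fit of P(integrated) vs. opening angle,
# yielding the point of equality (50% integrated). Illustrative data only.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, x0, k):
    """Logistic function with midpoint x0 and slope k."""
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

angles = np.array([10, 30, 50, 70, 90, 110, 130], dtype=float)  # degrees
p_integrated = np.array([0.05, 0.15, 0.35, 0.55, 0.75, 0.90, 0.97])

(x0, k), _ = curve_fit(logistic, angles, p_integrated, p0=[70.0, 0.05])
print(round(x0, 1))  # estimated point of equality (degrees)
```

Fitting this curve separately to onset percepts and to prolonged-viewing percepts would expose exactly the differences the study reports: a shifted midpoint and a steeper slope at onset.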


Subject(s)
Nystagmus, Optokinetic; Vision, Binocular; Humans; Photic Stimulation
5.
Acta Psychol (Amst) ; 202: 102957, 2020 Jan.
Article in English | MEDLINE | ID: mdl-31841879

ABSTRACT

We present a set of 423 three-second animated action movie clips that we expect to be useful for a variety of experimental paradigms in which sentences are elicited. Each clip depicts either an action involving only an agent (intransitive action, e.g., a policeman who is sleeping), an action involving an agent and a patient (transitive action, e.g., a policeman shooting a pirate), or an action involving an agent, an object, and a beneficiary (ditransitive action, e.g., a policeman showing a hat to a pirate). To verify that the movie clips (when presented with a verb) indeed elicit intransitive, transitive, or ditransitive sentences, we conducted a written norming study with native speakers of American English. We asked 203 participants to describe the clips in a sentence using a given verb. The movie clips elicited valid responses in 90% of the cases. Moreover, there was an active-response bias for the transitives and a prepositional-object dative (PO-dative) response bias for the ditransitives. This bias differed between verbs in the ditransitives. A list is provided with all clips and the proportion of each response type for each clip. The clips are stored as MP4 files and can be freely downloaded.
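The per-clip proportions listed with the norming data can be computed by a simple tally over coded responses. A minimal sketch; the clip IDs and response codes below are made-up examples, not entries from the published list:

```python
# Sketch: proportion of each coded response type per clip. Example data only.
from collections import Counter

responses = {
    "clip_001": ["active", "active", "passive", "active", "invalid"],
    "clip_002": ["PO-dative", "DO-dative", "PO-dative", "PO-dative"],
}

proportions = {}
for clip, codes in responses.items():
    counts = Counter(codes)
    total = len(codes)
    proportions[clip] = {code: n / total for code, n in counts.items()}

print(proportions["clip_001"]["active"])    # 0.6
print(proportions["clip_002"]["PO-dative"])  # 0.75
```

Aggregating these proportions by verb rather than by clip would reproduce the verb-level differences in the PO-dative bias that the abstract mentions.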


Subject(s)
Language; Motion Pictures; Photic Stimulation/methods; Verbal Behavior/physiology; Adult; Aged; Female; Humans; Male; Middle Aged; Random Allocation; Young Adult