Results 1 - 3 of 3
1.
Bioinspir Biomim; 17(4), 2022 Jun 9.
Article in English | MEDLINE | ID: mdl-35580573

ABSTRACT

Optic flow provides rich information about world-relative self-motion and is used by many animals to guide movement. For example, self-motion along straight, linear paths without eye movements generates optic flow that radiates from a singularity specifying the direction of travel (heading). Many neural models of optic flow processing contain heading detectors tuned to the position of this singularity, a design informed by area MSTd of primate visual cortex, which has been linked to heading perception. Such biologically inspired models could support efficient self-motion estimation in robots, but existing systems are tailored to the limited scenario of linear self-motion and neglect sensitivity to self-motion along more natural curvilinear paths. In that case the observer experiences more complex motion patterns, whose appearance depends on the radius of the curved path (path curvature) and the direction of gaze. Indeed, MSTd neurons have been shown to exhibit tuning to optic flow patterns other than radial expansion, a property rarely captured in neural models. We investigated in a computational model whether a population of MSTd-like sensors tuned to radial, spiral, ground, and other optic flow patterns could support accurate estimation of parameters describing both linear and curvilinear self-motion. We used deep learning to decode self-motion parameters from the signals produced by this diverse population of MSTd-like units. We demonstrate that the system accurately estimates curvilinear path curvature, clockwise/counterclockwise sign, and gaze direction relative to the path tangent in both synthetic and naturalistic videos of simulated self-motion. Estimates remained stable over time while rapidly adapting to dynamic changes in the observer's curvilinear self-motion. Our results show that coupled biologically inspired and artificial neural network systems hold promise for robust vision-based self-motion estimation in robots.
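The heading-detector idea described above can be illustrated with a minimal template-matching sketch (this is an illustration of the general MSTd-inspired scheme, not the paper's actual model; all function names, the grid size, and the candidate lattice are assumptions):

```python
import numpy as np

def radial_flow(foe_xy, grid=15, extent=1.0):
    """Idealized optic flow for pure translation: unit vectors radiate
    from the focus of expansion (FoE) at image position foe_xy."""
    xs = np.linspace(-extent, extent, grid)
    X, Y = np.meshgrid(xs, xs)
    U, V = X - foe_xy[0], Y - foe_xy[1]
    mag = np.hypot(U, V) + 1e-9  # avoid division by zero at the FoE
    return U / mag, V / mag

def build_templates(candidates):
    """One MSTd-like unit per candidate FoE position."""
    return [radial_flow(c) for c in candidates]

def decode_heading(flow, templates, candidates):
    """Unit activation = dot product of the observed flow with its
    preferred template; heading = preferred FoE of the winner."""
    U, V = flow
    acts = [np.sum(U * tu + V * tv) for tu, tv in templates]
    return candidates[int(np.argmax(acts))]

# Candidate FoE positions on a coarse lattice (an assumption)
candidates = [(x, y) for x in np.linspace(-0.5, 0.5, 11)
                     for y in np.linspace(-0.5, 0.5, 11)]
templates = build_templates(candidates)
est = decode_heading(radial_flow((0.2, -0.1)), templates, candidates)
```

Here the winner-take-all readout recovers the FoE of a synthetic radial flow field; the paper instead decodes richer curvilinear parameters from the population with a learned (deep network) readout.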


Subject(s)
Motion Perception, Optic Flow, Visual Cortex, Animals, Motion, Motion Perception/physiology, Neurons/physiology, Visual Cortex/physiology
2.
Front Comput Neurosci; 16: 844289, 2022.
Article in English | MEDLINE | ID: mdl-35431848

ABSTRACT

This paper introduces a self-tuning mechanism for capturing rapid adaptation to changing visual stimuli by a population of neurons. Building upon the principles of efficient sensory encoding, we show how neural tuning curve parameters can be continually updated to optimally encode a time-varying distribution of recently detected stimulus values. We implemented this mechanism in a neural model that produces human-like estimates of self-motion direction (i.e., heading) based on optic flow. The parameters of speed-sensitive units were dynamically tuned in accordance with efficient sensory encoding such that the network remained sensitive as the distribution of optic flow speeds varied. In two simulation experiments, we found that dynamic tuning yielded more accurate, shorter-latency heading estimates than the same model with static tuning. We conclude that dynamic efficient sensory encoding offers a plausible approach for capturing adaptation to varying visual environments in biological visual systems and neural models alike.
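One common way to realize efficient sensory encoding of this kind is to place tuning-curve centers at the quantiles of the recent stimulus distribution, so that each unit fires equally often. The sketch below shows that idea in Python (a toy illustration under that assumption, not the paper's implementation; function names and parameters are invented):

```python
import numpy as np

def retune_centers(recent_stimuli, n_units):
    """Efficient-coding placement: put tuning-curve centers at the
    quantiles of the recently observed stimulus distribution."""
    qs = (np.arange(n_units) + 0.5) / n_units
    return np.quantile(recent_stimuli, qs)

def population_response(stimulus, centers, sigma=0.5):
    """Gaussian tuning curves evaluated at a single stimulus value."""
    return np.exp(-0.5 * ((stimulus - centers) / sigma) ** 2)

rng = np.random.default_rng(0)
slow = rng.exponential(1.0, 1000)        # mostly slow optic-flow speeds
fast = rng.exponential(1.0, 1000) + 5.0  # speed distribution shifts up

c_slow = retune_centers(slow, 8)   # centers packed into the slow range
c_fast = retune_centers(fast, 8)   # after retuning, centers follow the
                                   # new, faster distribution
```

With static tuning, the `c_slow` population would saturate once speeds shift to the `fast` regime; retuning keeps the units spread across the currently relevant range, which is the intuition behind the shorter-latency, more accurate estimates reported above.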

3.
J Vis; 20(3): 8, 2020 Mar 17.
Article in English | MEDLINE | ID: mdl-32232376

ABSTRACT

Affordance-based control and current-future control offer competing theoretical accounts of the visual control of locomotion. The aim of this study was to test predictions derived from these accounts about the necessity of self-motion (Experiment 1) and target-ground contact (Experiment 2) in perceiving whether a moving target can be intercepted before it reaches an escape zone. We designed a novel interception task wherein the ability to perceive target catchability before initiating movement was advantageous. Subjects pursued a target moving through a field in a virtual environment and attempted to intercept the target before it escaped into a forest. Targets were catchable on some trials but not others. If subjects perceived that they could not reach the target, they were instructed to immediately give up by pressing a button. After each trial, subjects received a point reward that incentivized them to pursue only those targets that were catchable. On the majority of trials, subjects either pursued and successfully intercepted the target or chose not to pursue at all, demonstrating that humans are sensitive to catchability while stationary. Performance also degraded when the target was floating rather than in contact with the ground. Both findings are incompatible with the current-future account and support the affordance-based account of choosing whether to pursue moving targets.
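Under an affordance-based account, catchability reduces to a comparison between the time needed to close the gap and the time the target needs to escape. A toy straight-line-chase version of that comparison (an illustration only; the study's virtual task and any actual model are more complex, and all names here are invented) can be written as:

```python
def catchable(gap, d_escape, v_target, v_max):
    """Affordance-style catchability test for a straight-line chase.

    The pursuer starts `gap` metres behind a target moving at
    `v_target` m/s toward an escape line `d_escape` metres ahead of
    the target. The target is catchable iff the pursuer, moving at
    its maximum speed `v_max`, closes the gap before the target
    crosses the escape line.
    """
    if v_max <= v_target:
        return False  # the gap never closes
    t_catch = gap / (v_max - v_target)   # time to close the gap
    t_escape = d_escape / v_target       # time until the target escapes
    return t_catch < t_escape
```

The key point matching the results above is that this judgment depends only on quantities available before movement begins, so a stationary observer can in principle decide whether pursuit is worthwhile, as subjects did.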


Subject(s)
Decision Making/physiology, Motion Perception/physiology, Psychomotor Performance/physiology, Adolescent, Female, Humans, Male, Young Adult