Results 1 - 20 of 33
1.
Sci Robot ; 8(82): eadg3679, 2023 09 13.
Article in English | MEDLINE | ID: mdl-37756384

ABSTRACT

For many robotics applications, it is desirable to have relatively low-power and efficient onboard solutions. We took inspiration from insects, such as ants, that are capable of learning and following routes in complex natural environments using relatively constrained sensory and neural systems. Such capabilities are particularly relevant to applications such as agricultural robotics, where visual navigation through dense vegetation remains a challenging task. In this scenario, a route is likely to have high self-similarity and be subject to changing lighting conditions and motion over uneven terrain, and the effects of wind on leaves increase the variability of the input. We used a bioinspired event camera on a terrestrial robot to collect visual sequences along routes in natural outdoor environments and applied a neural algorithm for spatiotemporal memory that is closely based on a known neural circuit in the insect brain. We show that this method can plausibly support route recognition for visual navigation and is more robust than SeqSLAM when evaluated on repeated runs on the same route or routes with small lateral offsets. By encoding memory in a spiking neural network running on a neuromorphic computer, our model can evaluate visual familiarity in real time from event camera footage.


Subject(s)
Robotics , Learning , Brain , Agriculture , Algorithms
2.
Sci Adv ; 9(16): eadg2094, 2023 04 21.
Article in English | MEDLINE | ID: mdl-37083522

ABSTRACT

Quantifying the behavior of small animals traversing long distances in complex environments is one of the most difficult tracking scenarios for computer vision. Tiny and low-contrast foreground objects have to be localized in cluttered and dynamic scenes as well as trajectories compensated for camera motion and drift in multiple lengthy recordings. We introduce CATER, a novel methodology combining an unsupervised probabilistic detection mechanism with a globally optimized environment reconstruction pipeline enabling precision behavioral quantification in natural environments. Implemented as an easy to use and highly parallelized tool, we show its application to recover fine-scale motion trajectories, registered to a high-resolution image mosaic reconstruction, of naturally foraging desert ants from unconstrained field recordings. By bridging the gap between laboratory and field experiments, we gain previously unknown insights into ant navigation with respect to motivational states, previous experience, and current environments and provide an appearance-agnostic method applicable to study the behavior of a wide range of terrestrial species under realistic conditions.


Subject(s)
Ants , Environment , Animals , Vision, Ocular , Motion
3.
Bioinspir Biomim ; 18(3), 2023 03 27.
Article in English | MEDLINE | ID: mdl-36881919

ABSTRACT

Many invertebrates are ideal model systems on which to base robot design principles due to their success in solving seemingly complex tasks across domains while possessing smaller nervous systems than vertebrates. Three areas are particularly relevant for robot designers: Research on flying and crawling invertebrates has inspired new materials and geometries from which robot bodies (their morphologies) can be constructed, enabling a new generation of softer, smaller, and lighter robots. Research on walking insects has informed the design of new systems for controlling robot bodies (their motion control) and adapting their motion to their environment without costly computational methods. And research combining wet and computational neuroscience with robotic validation methods has revealed the structure and function of core circuits in the insect brain responsible for the navigation and swarming capabilities (their mental faculties) displayed by foraging insects. The last decade has seen significant progress in the application of principles extracted from invertebrates, as well as the application of biomimetic robots to model and better understand how animals function. This Perspectives paper on the past 10 years of the Living Machines conference outlines some of the most exciting recent advances in each of these fields before outlining lessons gleaned and the outlook for the next decade of invertebrate robotic research.


Subject(s)
Biomimetics , Invertebrates , Models, Neurological , Robotics , Animals , Humans , Biomimetics/methods , Biomimetics/trends , Insecta/anatomy & histology , Insecta/physiology , Invertebrates/anatomy & histology , Invertebrates/physiology , Motion , Neurosciences/trends , Reproducibility of Results , Robotics/instrumentation , Robotics/methods , Robotics/trends
4.
Elife ; 11, 2022 10 13.
Article in English | MEDLINE | ID: mdl-36226912

ABSTRACT

Revealing the functioning of compound eyes is of interest to biologists and engineers alike who wish to understand how visually complex behaviours (e.g. detection, tracking, and navigation) arise in nature, and to abstract concepts to develop novel artificial sensory systems. A key investigative method is to replicate the sensory apparatus using artificial systems, allowing for investigation of the visual information that drives animal behaviour when exposed to environmental cues. To date, 'compound eye models' (CEMs) have largely explored features such as field of view and angular resolution, but the roles of shape and overall structure have been mostly overlooked due to modelling complexity. Modern real-time ray-tracing technologies are enabling the construction of a new generation of computationally fast, high-fidelity CEMs. This work introduces a new open-source CEM software (CompoundRay) that is capable of accurately rendering the visual perspective of bees (6000 individual ommatidia arranged on 2 realistic eye surfaces) at over 3000 frames per second. We show how the speed and accuracy facilitated by this software can be used to investigate pressing research questions (e.g. how low resolution compound eyes can localise small objects) using modern methods (e.g. machine learning-based information exploration).


Subject(s)
Bees , Compound Eye, Arthropod , Animals , Cues
5.
Elife ; 10, 2021 12 09.
Article in English | MEDLINE | ID: mdl-34882094

ABSTRACT

The central complex of the insect midbrain is thought to coordinate insect guidance strategies. Computational models can account for specific behaviours, but their applicability across sensory and task domains remains untested. Here, we assess the capacity of our previous model (Sun et al. 2020) of visual navigation to generalise to olfactory navigation and its coordination with other guidance in flies and ants. We show that fundamental to this capacity is the use of a biologically plausible neural copy-and-shift mechanism that ensures sensory information is presented in a format compatible with the insect steering circuit regardless of its source. Moreover, the same mechanism is shown to allow the transfer of cues from unstable/egocentric to stable/geocentric frames of reference, providing a first account of the mechanism by which foraging insects robustly recover from environmental disturbances. We propose that these circuits can be flexibly repurposed by different insect navigators to address their unique ecological needs.


Subject(s)
Central Nervous System/physiology , Chemotaxis/physiology , Insecta/physiology , Movement/physiology , Spatial Navigation/physiology , Animals , Models, Theoretical
6.
Sci Rep ; 10(1): 15358, 2020 09 21.
Article in English | MEDLINE | ID: mdl-32958797

ABSTRACT

Spotted wing drosophila, Drosophila suzukii, is a serious invasive pest impacting the production of multiple fruit crops, including soft and stone fruits such as strawberries, raspberries and cherries. Effective control is challenging and reliant on integrated pest management, which includes the use of an ever-decreasing number of approved insecticides. New means to reduce the impact of this pest that can be integrated into control strategies are urgently required. In many production regions, including the UK, soft fruit are typically grown inside tunnels clad with polyethylene-based materials. These can be modified to filter specific wavebands of light. We investigated whether targeted spectral modifications to cladding materials that disrupt insect vision could reduce the incidence of D. suzukii. We present a novel approach that starts from a neuroscientific investigation of insect sensory systems and ends with in-field testing of new cladding materials inspired by the biological data. We show that D. suzukii are predominantly sensitive to wavelengths below 405 nm (ultraviolet) and above 565 nm (orange & red) and that targeted blocking of lower wavebands (up to 430 nm) using light restricting materials reduces pest populations by up to 73% in field trials.


Subject(s)
Crops, Agricultural/parasitology , Drosophila/growth & development , Fruit/parasitology , Insect Control/methods , Animals , Insecticides/administration & dosage , Light
7.
Elife ; 92020 06 26.
Article in English | MEDLINE | ID: mdl-32589143

ABSTRACT

Insect navigation arises from the coordinated action of concurrent guidance systems, but the neural mechanisms through which each functions, and are then coordinated, remain unknown. We propose that insects require distinct strategies to retrace familiar routes (route-following) and directly return from novel to familiar terrain (homing) using different aspects of frequency-encoded views that are processed in different neural pathways. We also demonstrate how the Central Complex and Mushroom Bodies regions of the insect brain may work in tandem to coordinate the directional output of different guidance cues through a contextually switched ring-attractor inspired by neural recordings. The resultant unified model of insect navigation reproduces behavioural data from a series of cue conflict experiments in realistic animal environments and offers testable hypotheses of where and how insects process visual cues, utilise the different information that they provide and coordinate their outputs to achieve the adaptive behaviours observed in the wild.


Subject(s)
Insecta/physiology , Models, Neurological , Nervous System Physiological Phenomena , Spatial Navigation/physiology , Animals , Ants/anatomy & histology , Ants/physiology , Brain/anatomy & histology , Brain/physiology , Drosophila/anatomy & histology , Drosophila/physiology , Insecta/anatomy & histology , Mushroom Bodies/anatomy & histology , Mushroom Bodies/physiology , Nervous System/anatomy & histology , Neural Pathways/anatomy & histology , Neural Pathways/physiology
8.
J Exp Biol ; 223(Pt 14), 2020 07 15.
Article in English | MEDLINE | ID: mdl-32487668

ABSTRACT

Ants can navigate by comparing the currently perceived view with memorised views along a familiar foraging route. Models regarding route-following suggest that the views are stored and recalled independently of the sequence in which they occur. Hence, the ant only needs to evaluate the instantaneous familiarity of the current view to obtain a heading direction. This study investigates whether ant homing behaviour is influenced by alterations in the sequence of views experienced along a familiar route, using the frequency of stop-and-scan behaviour as an indicator of the ant's navigational uncertainty. Ants were trained to forage between their nest and a feeder which they exited through a short channel before proceeding along the homeward route. In tests, ants were collected before entering the nest and released again in the channel, which was placed either in its original location or halfway along the route. Ants exiting the familiar channel in the middle of the route would thus experience familiar views in a novel sequence. Results show that ants exiting the channel scan significantly more when they find themselves in the middle of the route, compared with when emerging at the expected location near the feeder. This behaviour suggests that previously encountered views influence the recognition of current views, even when these views are highly familiar, revealing a sequence component to route memory. How information about view sequences could be implemented in the insect brain, as well as potential alternative explanations to our results, are discussed.


Subject(s)
Ants , Homing Behavior , Orientation , Animals , Cues , Memory , Recognition, Psychology
9.
Anim Cogn ; 23(6): 1129-1141, 2020 Nov.
Article in English | MEDLINE | ID: mdl-32323027

ABSTRACT

Animals travelling through the world receive input from multiple sensory modalities that could be important for the guidance of their journeys. Given the availability of a rich array of cues, from idiothetic information to input from sky compasses and visual information through to olfactory and other cues (e.g. gustatory, magnetic, anemotactic or thermal) it is no surprise to see multimodality in most aspects of navigation. In this review, we present the current knowledge of multimodal cue use during orientation and navigation in insects. Multimodal cue use is adapted to a species' sensory ecology and shapes navigation behaviour both during the learning of environmental cues and when performing complex foraging journeys. The simultaneous use of multiple cues is beneficial because it provides redundant navigational information, and in general, multimodality increases robustness, accuracy and overall foraging success. We use examples from sensorimotor behaviours in mosquitoes and flies as well as from large scale navigation in ants, bees and insects that migrate seasonally over large distances, asking at each stage how multiple cues are combined behaviourally and what insects gain from using different modalities.


Subject(s)
Ants , Homing Behavior , Animals , Bees , Cues , Learning , Orientation
10.
Sensors (Basel) ; 20(1), 2020 Jan 03.
Article in English | MEDLINE | ID: mdl-31947829

ABSTRACT

Automation of agricultural processes requires systems that can accurately detect and classify produce in real industrial environments that include variation in fruit appearance due to illumination, occlusion, seasons, weather conditions, etc. In this paper we combine a visual processing approach inspired by colour-opponent theory in humans with recent advancements in one-stage deep learning networks to accurately, rapidly and robustly detect ripe soft fruits (strawberries) in real industrial settings and using standard (RGB) camera input. The resultant system was tested on an existing dataset captured in controlled conditions as well as our new real-world dataset captured on a real strawberry farm over two months. We utilise the F1 score, the harmonic mean of precision and recall, to show our system matches the state-of-the-art detection accuracy (F1: 0.793 vs. 0.799) in controlled conditions; has greater generalisation and robustness to variation of spatial parameters (camera viewpoint) in the real-world dataset (F1: 0.744); and at a fraction of the computational cost, allowing classification at almost 30 fps. We propose that the L*a*b*Fruits system addresses some of the most pressing limitations of current fruit detection systems and is well-suited to application in areas such as yield forecasting and harvesting. Beyond the target application in agriculture this work also provides a proof-of-principle whereby increased performance is achieved through analysis of the domain data, capturing features at the input level rather than simply increasing model complexity.
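The F1 metric quoted in this abstract is simply the harmonic mean of precision and recall; a minimal sketch of that computation (illustrative only, not the paper's evaluation code):

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall (the F1 metric).

    precision = TP / (TP + FP); recall = TP / (TP + FN).
    """
    if precision + recall == 0.0:
        return 0.0  # convention: F1 is 0 when both inputs are 0
    return 2.0 * precision * recall / (precision + recall)
```

For example, a detector with precision 0.5 and perfect recall scores F1 = 2/3, illustrating how the harmonic mean penalises imbalance between the two quantities.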

11.
J Neurosci Methods ; 330: 108455, 2020 01 15.
Article in English | MEDLINE | ID: mdl-31739118

ABSTRACT

BACKGROUND: Image-based tracking of individual animals can provide rich data to underpin breakthroughs in biological and medical research, but few if any existing methods extend to tracking unconstrained natural behaviour in the field. NEW METHOD: We have developed a visual tracking system for animals filmed with a freely moving hand-held or drone-operated camera in their natural environment. This exploits a global inference method for detecting motion of an animal against a cluttered background. Trajectories are then generated by a novel video key-frame selection scheme in combination with a geometrically constrained image stitching algorithm, resulting in a two-dimensional panorama image of the environment on which the dense animal path is displayed. RESULTS: By introducing a minimal and plausible set of constraints regarding the camera orientation and movement, we demonstrate that both per-frame animal positions and overall trajectories can be extracted with reasonable accuracy, for a range of different animals, environments and imaging modalities. COMPARISON: Our method requires only a single uncalibrated camera, does not require marking or training data to detect the animal, and makes no prior assumptions about appearance of the target or background. In particular it can detect targets occupying fewer than 20 pixels in the image, and deal with poor contrast, highly dynamic lighting and frequent occlusion. CONCLUSION: Our algorithm produces highly informative qualitative trajectories embedded in a panorama of the environment. The results are still subject to rotational drift and additional scaling routines would be needed to obtain absolute real-world coordinates. It nevertheless provides a flexible and easy-to-use system to obtain rich data on natural animal behaviour in the field.


Subject(s)
Algorithms , Behavior, Animal , Movement , Neurosciences , Video Recording , Animals , Behavior, Animal/physiology , Environment , Movement/physiology , Neurosciences/instrumentation , Neurosciences/methods , Video Recording/instrumentation , Video Recording/methods
12.
PLoS Comput Biol ; 15(7): e1007123, 2019 07.
Article in English | MEDLINE | ID: mdl-31318859

ABSTRACT

Many insects navigate by integrating the distances and directions travelled on an outward path, allowing direct return to the starting point. Fundamental to the reliability of this process is the use of a neural compass based on external celestial cues. Here we examine how such compass information could be reliably computed by the insect brain, given realistic constraints on the sky polarisation pattern and the insect eye sensor array. By processing the degree of polarisation in different directions for different parts of the sky, our model can directly estimate the solar azimuth and also infer the confidence of the estimate. We introduce a method to correct for tilting of the sensor array, as might be caused by travel over uneven terrain. We also show that the confidence can be used to approximate the change in sun position over time, allowing the compass to remain fixed with respect to 'true north' during long excursions. We demonstrate that the compass is robust to disturbances and can be effectively used as input to an existing neural model of insect path integration. We discuss the plausibility of our model to be mapped to known neural circuits, and to be implemented for robot navigation.


Subject(s)
Behavior, Animal/physiology , Insecta/physiology , Models, Biological , Animals , Brain/physiology , Computational Biology , Computer Simulation , Cues , Homing Behavior/physiology , Light , Models, Neurological , Optic Lobe, Nonmammalian/physiology , Orientation/physiology , Photoreceptor Cells, Invertebrate/physiology , Spatial Behavior/physiology , Sunlight
13.
Interface Focus ; 8(4): 20180010, 2018 Aug 06.
Article in English | MEDLINE | ID: mdl-29951190

ABSTRACT

Visual memory is crucial to navigation in many animals, including insects. Here, we focus on the problem of visual homing, that is, using comparison of the view at a current location with a view stored at the home location to control movement towards home by a novel shortcut. Insects show several visual specializations that appear advantageous for this task, including almost panoramic field of view and ultraviolet light sensitivity, which enhances the salience of the skyline. We discuss several proposals for subsequent processing of the image to obtain the required motion information, focusing on how each might deal with the problem of yaw rotation of the current view relative to the home view. Possible solutions include tagging of views with information from the celestial compass system, using multiple views pointing towards home, or rotation invariant encoding of the view. We illustrate briefly how a well-known shape description method from computer vision, Zernike moments, could provide a compact and rotation invariant representation of sky shapes to enhance visual homing. We discuss the biological plausibility of this solution, and also a fourth strategy, based on observed behaviour of insects, that involves transfer of information from visual memory matching to the compass system.

14.
Curr Biol ; 27(3): 401-407, 2017 Feb 06.
Article in English | MEDLINE | ID: mdl-28111152

ABSTRACT

Ants can navigate over long distances between their nest and food sites using visual cues [1, 2]. Recent studies show that this capacity is undiminished when walking backward while dragging a heavy food item [3-5]. This challenges the idea that ants use egocentric visual memories of the scene for guidance [1, 2, 6]. Can ants use their visual memories of the terrestrial cues when going backward? Our results suggest that ants do not adjust their direction of travel based on the perceived scene while going backward. Instead, they maintain a straight direction using their celestial compass. This direction can be dictated by their path integrator [5] but can also be set using terrestrial visual cues after a forward peek. If the food item is too heavy to enable body rotations, ants moving backward drop their food on occasion, rotate and walk a few steps forward, return to the food, and drag it backward in a now-corrected direction defined by terrestrial cues. Furthermore, we show that ants can maintain their direction of travel independently of their body orientation. It thus appears that egocentric retinal alignment is required for visual scene recognition, but ants can translate this acquired directional information into a holonomic frame of reference, which enables them to decouple their travel direction from their body orientation and hence navigate backward. This reveals substantial flexibility and communication between different types of navigational information: from terrestrial to celestial cues and from egocentric to holonomic directional memories. VIDEO ABSTRACT.


Subject(s)
Ants/physiology , Homing Behavior/physiology , Animals , Cues , Memory , Orientation/physiology , Space Perception , Spatial Navigation , Vision, Ocular/physiology
15.
Front Behav Neurosci ; 10: 69, 2016.
Article in English | MEDLINE | ID: mdl-27147991

ABSTRACT

Ants are known to be capable of homing to their nest after displacement to a novel location. This is widely assumed to involve some form of retinotopic matching between their current view and previously experienced views. One simple algorithm proposed to explain this behavior is continuous retinotopic alignment, in which the ant constantly adjusts its heading by rotating to minimize the pixel-wise difference of its current view from all views stored while facing the nest. However, ants with large prey items will often drag them home while facing backwards. We tested whether displaced ants (Myrmecia croslandi) dragging prey could still home despite experiencing an inverted view of their surroundings under these conditions. Ants moving backwards with food took similarly direct paths to the nest as ants moving forward without food, demonstrating that continuous retinotopic alignment is not a critical component of homing. It is possible that ants use initial or intermittent retinotopic alignment, coupled with some other direction stabilizing cue that they can utilize when moving backward. However, though most ants dragging prey would occasionally look toward the nest, we observed that their heading direction was not noticeably improved afterwards. We assume ants must use comparison of current and stored images for corrections of their path, but suggest they are either able to choose the appropriate visual memory for comparison using an additional mechanism, or can make such comparisons without retinotopic alignment.
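The retinotopic-alignment idea tested in this abstract can be sketched as a rotational image difference function: roll a panoramic view column by column and pick the rotation that minimises the pixel-wise difference from a stored nest-facing view. This is a toy sketch of the general algorithm under discussion, not the authors' code; `best_heading` and its array layout are assumptions.

```python
import numpy as np

def best_heading(current: np.ndarray, memory: np.ndarray) -> int:
    """Return the column shift of the panoramic image `current` that
    minimises the pixel-wise RMS difference from the stored view `memory`.
    Rows are elevation, columns are azimuth; one column shift corresponds
    to one unit of yaw rotation."""
    diffs = [np.sqrt(np.mean((np.roll(current, -r, axis=1) - memory) ** 2))
             for r in range(current.shape[1])]
    return int(np.argmin(diffs))
```

An inverted (backward-facing) view breaks this matching, which is the puzzle the experiment probes: ants still home even when such alignment is impossible.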

16.
PLoS Comput Biol ; 12(2): e1004683, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26866692

ABSTRACT

Ants, like many other animals, use visual memory to follow extended routes through complex environments, but it is unknown how their small brains implement this capability. The mushroom body neuropils have been identified as a crucial memory circuit in the insect brain, but their function has mostly been explored for simple olfactory association tasks. We show that a spiking neural model of this circuit originally developed to describe fruitfly (Drosophila melanogaster) olfactory association, can also account for the ability of desert ants (Cataglyphis velox) to rapidly learn visual routes through complex natural environments. We further demonstrate that abstracting the key computational principles of this circuit, which include one-shot learning of sparse codes, enables the theoretical storage capacity of the ant mushroom body to be estimated at hundreds of independent images.
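The computational principles this abstract names (a random expansion, sparse codes, one-shot learning) can be caricatured in a few lines. This is a toy familiarity model, not the paper's spiking circuit; the class name, layer sizes, and sparseness level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

class SparseFamiliarity:
    """Toy sketch of the mushroom-body principle: a fixed random projection
    expands the input into a high-dimensional code, winner-take-all keeps the
    top-k units (a sparse code), and one-shot learning marks those units as
    'seen'. Familiarity of a query view is the fraction of its active units
    already marked."""

    def __init__(self, n_input: int, n_kc: int = 2000, k: int = 40):
        self.proj = rng.standard_normal((n_kc, n_input))  # fixed random wiring
        self.k = k
        self.seen = np.zeros(n_kc, dtype=bool)

    def _code(self, x: np.ndarray) -> np.ndarray:
        act = self.proj @ x
        return np.argsort(act)[-self.k:]  # indices of the top-k active units

    def learn(self, x: np.ndarray) -> None:
        self.seen[self._code(x)] = True   # one-shot storage

    def familiarity(self, x: np.ndarray) -> float:
        return float(self.seen[self._code(x)].mean())  # 1.0 = fully familiar
```

Because each stored view marks only k of n_kc units, many images can be stored before codes start to collide, which is the intuition behind the capacity estimate of hundreds of independent images.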


Subject(s)
Action Potentials/physiology , Ants/physiology , Memory/physiology , Models, Neurological , Mushroom Bodies/physiology , Algorithms , Animals , Computational Biology , Neurons/physiology
17.
Anal Sci ; 31(11): 1183-8, 2015.
Article in English | MEDLINE | ID: mdl-26561264

ABSTRACT

Miniaturization of gas chromatography (GC) instrumentation enables field detection of volatile organic compounds (VOCs) for chembio-applications such as clandestine human transport and disease diagnostics. We fabricated a mesoscale pulsed discharge helium ionization detector (micro-PDHID) for integrating with our previously described mini-GC hardware. Stainless steel electrodes fabricated by photochemical etching and electroforming facilitated rapid prototyping and enabled nesting of inter-electrode insulators for self-alignment of the detector core during assembly. The prototype was ∼10 cm³ relative to >400 cm³ of a commercial PDHID, but with a comparable time to sweep a VOC peak from the detector cell (170 ms and 127 ms, respectively). Electron trajectory modeling, gas flow rate, voltage bias, and GC outlet location were optimized for improving sensitivity. Despite 40-fold miniaturization, the micro-PDHID detected 18 ng of the human emanation, 3-methyl-2-hexenoic acid, with <3-fold decrease in sensitivity relative to the commercial detector. The micro-PDHID was rugged and operated for 9 months without failure.


Subject(s)
Chromatography, Gas/instrumentation , Helium/chemistry , Caproates/analysis , Electrodes , Humans , Miniaturization , Stainless Steel
18.
Proc Biol Sci ; 282(1816): 20151484, 2015 Oct 07.
Article in English | MEDLINE | ID: mdl-26400741

ABSTRACT

In situations with redundant or competing sensory information, humans have been shown to perform cue integration, weighting different cues according to their certainty in a quantifiably optimal manner. Ants have been shown to merge the directional information available from their path integration (PI) and visual memory, but as yet it is not clear that they do so in a way that reflects the relative certainty of the cues. In this study, we manipulate the variance of the PI home vector by allowing ants (Cataglyphis velox) to run different distances and testing their directional choice when the PI vector direction is put in competition with visual memory. Ants show progressively stronger weighting of their PI direction as PI length increases. The weighting is quantitatively predicted by modelling the expected directional variance of home vectors of different lengths and assuming optimal cue integration. However, a subsequent experiment suggests ants may not actually compute an internal estimate of the PI certainty, but are using the PI home vector length as a proxy.
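Optimal integration of two directional cues is commonly modelled as a reliability-weighted vector sum, with each cue weighted inversely to its variance. A minimal sketch of that textbook computation, not the authors' model; the function name and radian convention are assumptions:

```python
import math

def integrate_cues(theta_pi: float, var_pi: float,
                   theta_vis: float, var_vis: float) -> float:
    """Combine two directional estimates (radians) by a reliability-weighted
    vector sum. Weights are inversely proportional to each cue's variance,
    which is optimal under a Gaussian approximation for small variances."""
    w_pi, w_vis = 1.0 / var_pi, 1.0 / var_vis
    x = w_pi * math.cos(theta_pi) + w_vis * math.cos(theta_vis)
    y = w_pi * math.sin(theta_pi) + w_vis * math.sin(theta_vis)
    return math.atan2(y, x)
```

With equal variances the combined heading bisects the two cues; as the path-integration variance shrinks (e.g. after a longer run), the output swings toward the PI direction, mirroring the behavioural weighting reported above.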


Subject(s)
Ants/physiology , Cues , Homing Behavior , Animals , Choice Behavior , Spatial Memory , Visual Perception
19.
Article in English | MEDLINE | ID: mdl-25895895

ABSTRACT

Desert ants are a model system for animal navigation, using visual memory to follow long routes across both sparse and cluttered environments. Most accounts of this behaviour assume retinotopic image matching, e.g. recovering heading direction by finding a minimum in the image difference function as the viewpoint rotates. But most models neglect the potential image distortion that could result from unstable head motion. We report that for ants running across a short section of natural substrate, the head pitch varies substantially: by over 20 degrees with no load, and 60 degrees when carrying a large food item. There is no evidence of head stabilisation. Using a realistic simulation of the ant's visual world, we demonstrate that this range of head pitch significantly degrades image matching. The effect of pitch variation can be ameliorated by a memory bank of views densely sampled along a route, so that an image sufficiently similar in pitch and location is available for comparison. However, with large pitch disturbance, inappropriate memories sampled at distant locations are often recalled and navigation along a route can be adversely affected. Ignoring images obtained at extreme pitches, or averaging images over several pitches, does not significantly improve performance.


Subject(s)
Ants/physiology , Head Movements/physiology , Homing Behavior/physiology , Orientation/physiology , Spatial Navigation , Algorithms , Animals , Computer Simulation , Desert Climate , Memory/physiology , Models, Biological , Spain , User-Computer Interface , Visual Perception
20.
J Exp Biol ; 218(Pt 6): 819-23, 2015 Mar.
Article in English | MEDLINE | ID: mdl-25788724

ABSTRACT

Visual navigation is a critical behaviour for many animals, and it has been particularly well studied in ants. Decades of ant navigation research have uncovered many ways in which efficient navigation can be implemented in small brains. For example, ants show us how visual information can drive navigation via procedural rather than map-like instructions. Two recent behavioural observations highlight interesting adaptive ways in which ants implement visual guidance. Firstly, it has been shown that the systematic nest searches of ants can be biased by recent experience of familiar scenes. Secondly, ants have been observed to show temporary periods of confusion when asked to repeat a route segment, even if that route segment is very familiar. Taken together, these results indicate that the navigational decisions of ants take into account their recent experiences as well as the currently perceived environment.


Subject(s)
Ants/physiology , Animals , Homing Behavior , Orientation , Spatial Navigation