Results 1 - 4 of 4
1.
Sensors (Basel) ; 20(18)2020 Sep 14.
Article in English | MEDLINE | ID: mdl-32937939

ABSTRACT

Automated robotic platforms are an important part of precision agriculture solutions for sustainable food production. Agri-robots require robust and accurate guidance systems in order to navigate between crops and to and from their base station. Onboard sensors such as machine vision cameras offer a flexible guidance alternative to more expensive solutions for structured environments such as scanning lidar or RTK-GNSS. The main challenges for visual crop row guidance are the dramatic differences in appearance of crops between farms and throughout the season and the variations in crop spacing and contours of the crop rows. Here we present a visual guidance pipeline for an agri-robot operating in strawberry fields in Norway that is based on semantic segmentation with a convolutional neural network (CNN) to segment input RGB images into crop and not-crop (i.e., drivable terrain) regions. To handle the uneven contours of crop rows in Norway's hilly agricultural regions, we develop a new adaptive multi-ROI method for fitting trajectories to the drivable regions. We test our approach in open-loop trials with a real agri-robot operating in the field and show that our approach compares favourably with traditional guidance approaches.
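The step from a segmentation mask to a steering trajectory can be sketched in a few lines. This is a minimal illustration, not the paper's adaptive multi-ROI method: it assumes fixed, equal-height ROI bands, takes the centroid of drivable pixels in each band, and fits a straight line through the centroids. The function name and the mask convention (1 = drivable, 0 = crop) are assumptions for this sketch.

```python
import numpy as np

def fit_row_trajectory(mask, n_rois=5):
    """Fit a guidance line through the drivable ('not-crop') region of a
    binary segmentation mask, using one centroid per horizontal ROI band.
    mask: 2-D array where 1 = drivable terrain, 0 = crop."""
    h, w = mask.shape
    band_h = h // n_rois
    centroids = []
    for i in range(n_rois):
        band = mask[i * band_h:(i + 1) * band_h]
        ys, xs = np.nonzero(band)
        if xs.size == 0:
            continue  # no drivable pixels in this band; skip it
        centroids.append((i * band_h + ys.mean(), xs.mean()))
    pts = np.array(centroids)
    # Least-squares line x = a*y + b through the band centroids;
    # the robot steers to keep this line centred in the image.
    a, b = np.polyfit(pts[:, 0], pts[:, 1], 1)
    return a, b
```

An adaptive scheme, as in the paper, would additionally vary the number and placement of the bands to follow curved row contours; the fixed bands here are the simplest starting point.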

2.
J Exp Biol ; 218(Pt 19): 3118-27, 2015 Oct.
Article in English | MEDLINE | ID: mdl-26276861

ABSTRACT

When using virtual-reality paradigms to study animal behaviour, careful attention must be paid to how the animal's actions are detected. This is particularly relevant in closed-loop experiments where the animal interacts with a stimulus. Many different sensor types have been used to measure aspects of behaviour, and although some sensors may be more accurate than others, few studies have examined whether, and how, such differences affect an animal's behaviour in a closed-loop experiment. To investigate this issue, we conducted experiments with tethered honeybees walking on an air-supported trackball and fixating a visual object in closed-loop. Bees walked faster and along straighter paths when the motion of the trackball was measured in the classical fashion - using optical motion sensors repurposed from computer mice - than when measured more accurately using a computer vision algorithm called 'FicTrac'. When computer mouse sensors were used to measure bees' behaviour, the bees modified their behaviour and achieved improved control of the stimulus. This behavioural change appears to be a response to a systematic error in the computer mouse sensor that reduces the sensitivity of this sensor system under certain conditions. Although the large perceived inertia and mass of the trackball relative to the honeybee is a limitation of tethered walking paradigms, observing differences depending on the sensor system used to measure bee behaviour was not expected. This study suggests that bees are capable of fine-tuning their motor control to improve the outcome of the task they are performing. Further, our findings show that caution is required when designing virtual-reality experiments, as animals can potentially respond to the artificial scenario in unexpected and unintended ways.


Subject(s)
Bees/physiology, Computer Simulation, Algorithms, Animals, Behavior, Animal/physiology, Optical Devices, Walking/physiology
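The classical trackball readout described in the abstract above integrates raw (dx, dy) count pairs from two optical mouse sensors into a fictive path. The sketch below shows one common convention; the sensor placement (two sensors 90° apart on the sphere's equator), the axis assignments, and the function name are assumptions for illustration, not the specific rig used in the study.

```python
import math

def integrate_trackball(readings, counts_per_rad):
    """Integrate (dx, dy) count pairs from two optical sensors placed
    90 degrees apart on the trackball equator into a fictive 2-D path.
    Convention assumed here: sensor A's dy is forward rotation, sensor
    B's dy is sideways rotation, and the mean of the two dx values is
    yaw (turning)."""
    x = y = heading = 0.0
    path = [(x, y)]
    for (ax, ay), (bx, by) in readings:
        forward = ay / counts_per_rad
        side = by / counts_per_rad
        heading += ((ax + bx) / 2) / counts_per_rad
        # Rotate the body-frame step into the world frame.
        x += forward * math.cos(heading) - side * math.sin(heading)
        y += forward * math.sin(heading) + side * math.cos(heading)
        path.append((x, y))
    return path
```

The systematic error discussed in the abstract enters exactly here: if a sensor under-reports counts under certain surface speeds or angles, the integrated path silently distorts, which is what a full-sphere computer-vision tracker avoids.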
3.
Proc Natl Acad Sci U S A ; 111(13): 5006-11, 2014 Apr 01.
Article in English | MEDLINE | ID: mdl-24639490

ABSTRACT

Attention allows animals to respond selectively to competing stimuli, enabling some stimuli to evoke a behavioral response while others are ignored. How the brain does this remains mysterious, although it is increasingly evident that even animals with the smallest brains display this capacity. For example, insects respond selectively to salient visual stimuli, but it is unknown where such selectivity occurs in the insect brain, or whether neural correlates of attention might predict the visual choices made by an insect. Here, we investigate neural correlates of visual attention in behaving honeybees (Apis mellifera). Using a closed-loop paradigm that allows tethered, walking bees to actively control visual objects in a virtual reality arena, we show that behavioral fixation increases neuronal responses to flickering, frequency-tagged stimuli. Attention-like effects were reduced in the optic lobes during replay of the same visual sequences, when bees were not able to control the visual displays. When bees were presented with competing frequency-tagged visual stimuli, selectivity in the medulla (an optic ganglion) preceded behavioral selection of a stimulus, suggesting that modulation of early visual processing centers precedes eventual behavioral choices made by these insects.


Subject(s)
Attention/physiology, Bees/physiology, Choice Behavior/physiology, Optic Lobe, Nonmammalian/physiology, Animals, Behavior, Animal, Brain/physiology, Evoked Potentials, Visual/physiology, Fixation, Ocular/physiology, Honey, Photic Stimulation, Walking/physiology
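The frequency-tagging analysis behind the abstract above rests on a simple idea: a stimulus flickering at f Hz drives neural activity at f Hz, so the response to each competing stimulus can be read off the amplitude spectrum at its tag frequency. A minimal sketch of that readout, assuming a single recorded channel and tag frequencies that fall on exact FFT bins (the function name is illustrative):

```python
import numpy as np

def tag_amplitudes(signal, fs, tag_freqs):
    """Estimate the response amplitude at each frequency tag via the FFT.
    signal: 1-D neural recording (e.g. a local field potential),
    fs: sample rate in Hz, tag_freqs: flicker frequencies in Hz."""
    n = len(signal)
    # One-sided amplitude spectrum, scaled so a sine of amplitude A
    # with a whole number of cycles in the window reads back as A.
    spectrum = np.abs(np.fft.rfft(signal - signal.mean())) * 2 / n
    freqs = np.fft.rfftfreq(n, d=1 / fs)
    # Read off the amplitude at the bin nearest each tag frequency.
    return {f: spectrum[np.argmin(np.abs(freqs - f))] for f in tag_freqs}
```

Comparing these amplitudes across conditions (closed-loop fixation vs. replay, or before vs. after a behavioural choice) is the kind of contrast the study reports; a real analysis would also window the data and average across trials.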
4.
J Neurosci Methods ; 225: 106-19, 2014 Mar 30.
Article in English | MEDLINE | ID: mdl-24491637

ABSTRACT

Studying how animals interface with a virtual reality can further our understanding of how attention, learning and memory, sensory processing, and navigation are handled by the brain, at both the neurophysiological and behavioural levels. To this end, we have developed a novel vision-based tracking system, FicTrac (Fictive path Tracking software), for estimating the path an animal makes whilst rotating an air-supported sphere using only input from a standard camera and computer vision techniques. We have found that the accuracy and robustness of FicTrac outperforms a low-cost implementation of a standard optical mouse-based approach for generating fictive paths. FicTrac is simple to implement for a wide variety of experimental configurations and, importantly, is fast to execute, enabling real-time sensory feedback for behaving animals. We have used FicTrac to record the behaviour of tethered honeybees, Apis mellifera, whilst presenting visual stimuli in both open-loop and closed-loop experimental paradigms. We found that FicTrac could accurately register the fictive paths of bees as they walked towards bright green vertical bars presented on an LED arena. Using FicTrac, we have demonstrated closed-loop visual fixation in both the honeybee and the fruit fly, Drosophila melanogaster, establishing the flexibility of this system. FicTrac provides the experimenter with a simple yet adaptable system that can be combined with electrophysiological recording techniques to study the neural mechanisms of behaviour in a variety of organisms, including walking vertebrates.


Subject(s)
Behavior, Animal/physiology, Imaging, Three-Dimensional/methods, Optical Imaging/methods, Software, Animals, Bees, Movement/physiology
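The closed-loop fixation paradigm that FicTrac enables has a very small control core: each frame, the animal's measured yaw moves the stimulus in the opposite direction, so turning toward the bar keeps it in front. A minimal sketch of one such update, with an assumed sign convention and default gain (both are illustrative, not the values used in the paper):

```python
def closed_loop_step(bar_angle, yaw_rate, dt, gain=-1.0):
    """One update of a closed-loop fixation arena.
    bar_angle: current bar azimuth in degrees, yaw_rate: the animal's
    measured turning rate in deg/s, dt: frame interval in seconds.
    With a negative gain, turning right moves the bar left, so the
    animal can stabilise the bar in its frontal visual field."""
    bar_angle += gain * yaw_rate * dt
    # Wrap the result into (-180, 180] degrees.
    return (bar_angle + 180.0) % 360.0 - 180.0
```

The tracker's latency and accuracy matter because this update runs every frame: a slow or biased yaw estimate feeds a slow or biased stimulus, which, as the second study above shows, the animal may adapt to in unintended ways.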