ABSTRACT
To speed up path planning for drones in unknown environments, a new bio-inspired path planning method, E-DQN (event-based deep Q-network), which introduces an event stream into a reinforcement learning network, is proposed. First, event data are collected through the AirSim simulator for environmental perception, and an auto-encoder is used to extract data features and generate event weights. Then, the event weights are fed into a DQN (deep Q-network) to choose the next action. Finally, simulation and verification experiments are conducted in a virtual obstacle environment built with Unreal Engine and AirSim. The experimental results show that the proposed algorithm enables drones to find the goal in unknown environments and plans paths faster than commonly used methods.
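The pipeline described above (auto-encoder features from event data driving a DQN's action choice) can be sketched minimally as follows. This is an illustrative NumPy toy, not the paper's implementation: the frame size, feature dimension, action set, and the interpretation of "event weights" as the auto-encoder's bottleneck activations are all assumptions, and the parameters are untrained random values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy event frame: accumulated event-polarity counts per pixel
# (16x16 resolution is assumed for illustration).
event_frame = rng.integers(0, 4, size=(16, 16)).astype(float)

def encode(frame, W_enc):
    # Auto-encoder bottleneck: compress the event frame into a feature
    # vector (here standing in for the paper's "event weights").
    return np.tanh(W_enc @ frame.ravel())

def q_values(z, W_q):
    # Linear Q-head over the event features: one value per discrete action.
    return W_q @ z

# Randomly initialized, untrained parameters (illustration only).
W_enc = rng.normal(0.0, 0.1, size=(32, 256))
W_q = rng.normal(0.0, 0.1, size=(4, 32))  # 4 assumed actions, e.g. forward/back/left/right

z = encode(event_frame, W_enc)
q = q_values(z, W_q)
action = int(np.argmax(q))  # greedy choice of the next action
print("Q-values:", q, "chosen action:", action)
```

In a full E-DQN both networks would be trained (the auto-encoder by reconstruction loss, the Q-head by temporal-difference learning); the sketch only shows how the event features flow into the action selection.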
ABSTRACT
Introduction: With the development of artificial intelligence and brain science, brain-inspired navigation and path planning have attracted widespread attention. Methods: In this paper, we present a place-cell-based path planning algorithm that uses a spiking neural network (SNN) to create efficient routes for drones. First, place cells are modeled with the leaky integrate-and-fire (LIF) neuron model. Then, the connection weights between neurons are trained with spike-timing-dependent plasticity (STDP) learning rules. Afterwards, a synaptic vector field is constructed to avoid obstacles and find the shortest path. Results: Finally, simulation experiments in both a Python environment and an Unreal Engine environment are conducted to evaluate the validity of the algorithm. Discussion: The experimental results demonstrate the validity, robustness, and computational speed of the proposed model.
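The two building blocks named in the abstract, LIF place-cell dynamics and pairwise STDP weight updates, can be sketched as below. All constants (membrane time constant, thresholds, STDP amplitudes and time windows) are generic textbook values assumed for illustration, not the paper's.

```python
import numpy as np

def lif_spike_times(current, dt=1.0, tau=10.0, v_rest=0.0,
                    v_thresh=1.0, v_reset=0.0, R=1.0):
    # Leaky integrate-and-fire: tau * dv/dt = -(v - v_rest) + R*I,
    # integrated with forward Euler; spike and reset at threshold.
    v, spikes = v_rest, []
    for t, I in enumerate(current):
        v += dt * (-(v - v_rest) + R * I) / tau
        if v >= v_thresh:
            spikes.append(t * dt)
            v = v_reset
    return spikes

def stdp_dw(t_pre, t_post, A_plus=0.01, A_minus=0.012,
            tau_plus=20.0, tau_minus=20.0):
    # Pairwise STDP: potentiate when the presynaptic spike precedes the
    # postsynaptic one, depress when it follows.
    dw = 0.0
    for tp in t_pre:
        for tq in t_post:
            delta = tq - tp
            if delta > 0:
                dw += A_plus * np.exp(-delta / tau_plus)
            elif delta < 0:
                dw -= A_minus * np.exp(delta / tau_minus)
    return dw

# Two place cells driven by constant currents; the more strongly driven
# cell fires first, so its synapse onto the other tends to potentiate.
pre = lif_spike_times(np.full(100, 1.5))
post = lif_spike_times(np.full(100, 1.3))
dw = stdp_dw(pre, post)
print("first pre/post spikes:", pre[0], post[0], "weight change:", dw)
```

In the full model such trained weights between neighboring place cells would define the synaptic vector field that steers the drone around obstacles toward the goal.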