Direct training high-performance deep spiking neural networks: a review of theories and methods.
Zhou, Chenlin; Zhang, Han; Yu, Liutao; Ye, Yumin; Zhou, Zhaokun; Huang, Liwei; Ma, Zhengyu; Fan, Xiaopeng; Zhou, Huihui; Tian, Yonghong.
Affiliation
  • Zhou C; Peng Cheng Laboratory, Shenzhen, China.
  • Zhang H; Peng Cheng Laboratory, Shenzhen, China.
  • Yu L; Faculty of Computing, Harbin Institute of Technology, Harbin, China.
  • Ye Y; Peng Cheng Laboratory, Shenzhen, China.
  • Zhou Z; Peng Cheng Laboratory, Shenzhen, China.
  • Huang L; Peng Cheng Laboratory, Shenzhen, China.
  • Ma Z; School of Electronic and Computer Engineering, Shenzhen Graduate School, Peking University, Shenzhen, China.
  • Fan X; Peng Cheng Laboratory, Shenzhen, China.
  • Zhou H; National Key Laboratory for Multimedia Information Processing, School of Computer Science, Peking University, Beijing, China.
  • Tian Y; Peng Cheng Laboratory, Shenzhen, China.
Front Neurosci ; 18: 1383844, 2024.
Article in En | MEDLINE | ID: mdl-39145295
ABSTRACT
Spiking neural networks (SNNs) offer a promising energy-efficient alternative to artificial neural networks (ANNs) by virtue of their high biological plausibility, rich spatial-temporal dynamics, and event-driven computation. Direct training algorithms based on the surrogate gradient method provide sufficient flexibility to design novel SNN architectures and explore the spatial-temporal dynamics of SNNs. According to previous studies, model performance is highly dependent on model size. Recently, directly trained deep SNNs have achieved great progress on both neuromorphic datasets and large-scale static datasets. Notably, transformer-based SNNs show performance comparable to that of their ANN counterparts. In this paper, we provide a new perspective that systematically and comprehensively summarizes the theories and methods for training high-performance deep SNNs, covering theoretical fundamentals, spiking neuron models, advanced SNN models and residual architectures, software frameworks and neuromorphic hardware, applications, and future trends.
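The direct-training approach the abstract refers to replaces the non-differentiable Heaviside spike function with a smooth surrogate during the backward pass. As an illustrative sketch (not the paper's own code), the following NumPy snippet simulates a single leaky integrate-and-fire (LIF) neuron and defines a fast-sigmoid-style surrogate derivative; the function names, the time constant `tau`, and the surrogate shape parameter `alpha` are assumptions chosen for illustration.

```python
import numpy as np

def surrogate_grad(v, v_th=1.0, alpha=2.0):
    # Derivative of a fast-sigmoid relaxation of the Heaviside spike
    # function; used in place of its true, zero-almost-everywhere gradient
    # so that error signals can flow through the spiking nonlinearity.
    return alpha / (2.0 * (1.0 + alpha * np.abs(v - v_th)) ** 2)

def lif_forward(currents, v_th=1.0, tau=2.0):
    """Simulate one leaky integrate-and-fire neuron over T timesteps.

    currents: array of shape (T,) with the input current at each step.
    Returns (spikes, membrane_potentials before thresholding).
    """
    v = 0.0
    spikes, potentials = [], []
    for i_t in currents:
        v = v + (i_t - v) / tau        # leaky integration toward the input
        potentials.append(v)
        s = 1.0 if v >= v_th else 0.0  # Heaviside spike in the forward pass
        spikes.append(s)
        v = v * (1.0 - s)              # hard reset after a spike
    return np.array(spikes), np.array(potentials)

# Example: the neuron integrates sub-threshold input, spikes once the
# membrane potential crosses v_th, then resets.
spikes, potentials = lif_forward(np.array([0.5, 1.5, 2.0, 0.1]))
```

In a real framework this surrogate would be registered as the custom backward of the spike function (e.g., via an autograd mechanism), while the forward pass keeps the exact binary spikes; the sketch above only exposes the two ingredients separately.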
Full text: 1 Collection: 01-internacional Database: MEDLINE Language: En Journal: Front Neurosci / Front. neurosci. (Online) / Frontiers in neuroscience (Print) Year: 2024 Document type: Article Affiliation country: China Country of publication: Switzerland