Results 1 - 2 of 2
1.
Front Neurosci ; 17: 1121592, 2023.
Article in English | MEDLINE | ID: mdl-37214405

ABSTRACT

Spiking neural networks (SNNs), a form of neuromorphic, brain-inspired AI, have the potential to be a power-efficient alternative to artificial neural networks (ANNs). Spikes in SNN systems, also known as activations, tend to be extremely sparse, which minimizes the number of data accesses needed for processing. In addition, SNN systems are typically designed around addition operations, which consume much less energy than the multiply-and-accumulate operations used in DNN systems. The vast majority of neuromorphic hardware designs support rate-based SNNs, in which information is encoded by spike rates. Rate-based SNNs can be inefficient, as a large number of spikes must be transmitted and processed during inference. One coding scheme with the potential to improve efficiency is time-to-first-spike (TTFS) coding, in which information is conveyed not by the frequency of spikes but by the relative spike arrival time. In TTFS-based SNNs, each neuron spikes at most once during the entire inference process, resulting in high sparsity. The activation sparsity of TTFS-based SNNs is therefore higher than that of rate-based SNNs, but TTFS-based SNNs have yet to match the accuracy of rate-based SNNs. In this work, we propose two key improvements for TTFS-based SNN systems: (1) a novel optimization algorithm that improves the accuracy of TTFS-based SNNs and (2) a novel hardware accelerator for TTFS-based SNNs with a scalable and low-power design. Our improvements to TTFS coding and training raise the accuracy of TTFS-based SNNs to state-of-the-art results on the MNIST and Fashion-MNIST datasets, while our accelerator reduces power consumption by at least 2.4×, 25.9×, and 38.4× over state-of-the-art neuromorphic hardware on MNIST, Fashion-MNIST, and CIFAR10, respectively.
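
To make TTFS coding concrete, the sketch below shows one common linear first-spike encoding: stronger inputs fire earlier, and each input produces at most one spike. This is a minimal illustration only; the abstract does not specify the exact mapping, so the function name, the t_max window, and the linear rule are assumptions.

import numpy as np

def ttfs_encode(x, t_max=10.0):
    """Map normalized intensities x in [0, 1] to first-spike times.

    Stronger inputs fire earlier; a zero input never fires (time = inf).
    A common linear TTFS mapping, assumed here for illustration.
    """
    x = np.asarray(x, dtype=float)
    times = np.full(x.shape, np.inf)           # silent by default
    active = x > 0
    times[active] = t_max * (1.0 - x[active])  # x=1 fires at t=0; weak x near t_max
    return times

# A mostly dark image yields very few spike events, since each pixel
# contributes at most one spike.
print(ttfs_encode(np.array([0.0, 0.1, 0.9, 1.0])))  # approx. [inf, 9.0, 1.0, 0.0]

Because every neuron emits at most one event, the total spike count per inference is bounded by the neuron count, which is the source of the sparsity advantage over rate coding described above.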

2.
IEEE Trans Neural Netw Learn Syst ; 33(5): 1947-1958, 2022 05.
Article in English | MEDLINE | ID: mdl-34534091

ABSTRACT

Spiking neural networks (SNNs) use spatiotemporal spike patterns to represent and transmit information, a scheme that is not only biologically realistic but also well suited to ultralow-power, event-driven neuromorphic implementation. Like other deep learning techniques, deep SNNs (DeepSNNs) benefit from a deep architecture. However, training DeepSNNs is not straightforward, because the well-studied error backpropagation (BP) algorithm is not directly applicable. In this article, we first establish why error BP does not work well in DeepSNNs. We then propose a simple yet efficient rectified linear postsynaptic potential function (ReL-PSP) for spiking neurons and a spike-timing-dependent BP (STDBP) learning algorithm for DeepSNNs, in which the timing of individual spikes is used to convey information (temporal coding) and learning (BP) is performed on spike timings in an event-driven manner. We show that DeepSNNs trained with the proposed single-spike, time-based learning algorithm achieve state-of-the-art classification accuracy. Furthermore, using the trained model parameters obtained from the proposed STDBP learning algorithm, we demonstrate ultralow-power inference on a recently proposed neuromorphic inference accelerator. The experimental results show that the neuromorphic hardware consumes a total of 0.751 mW and classifies an image from the Modified National Institute of Standards and Technology (MNIST) dataset with a low latency of 47.71 ms. Overall, this work investigates the contribution of spike-timing dynamics to information encoding, synaptic plasticity, and decision-making, providing a new perspective on the design of future DeepSNNs and neuromorphic hardware.
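
The abstract names the ReL-PSP function but does not define it, so the following is a hedged sketch of the underlying idea: with a rectified-linear PSP kernel K(t - t_j) = max(0, t - t_j), the membrane potential is piecewise linear in time, and the threshold crossing (the output spike time) has a closed form that is differentiable with respect to the weights, which is what makes spike-timing-based BP tractable. The threshold value and all names below are illustrative assumptions, not taken from the paper.

import numpy as np

def rel_psp_spike_time(in_times, weights, threshold=1.0):
    # With K(t - t_j) = max(0, t - t_j), between consecutive input spikes
    #   V(t) = sum_{j: t_j < t} w_j * (t - t_j) = w_sum * t - wt_sum,
    # so the threshold crossing can be solved in closed form per segment.
    order = np.argsort(in_times)
    t_sorted = np.asarray(in_times, dtype=float)[order]
    w_sorted = np.asarray(weights, dtype=float)[order]
    w_sum, wt_sum = 0.0, 0.0
    for k, (t_j, w_j) in enumerate(zip(t_sorted, w_sorted)):
        w_sum += w_j
        wt_sum += w_j * t_j
        if w_sum <= 0:
            continue                      # slope not positive; cannot cross yet
        t_fire = (threshold + wt_sum) / w_sum
        t_next = t_sorted[k + 1] if k + 1 < len(t_sorted) else np.inf
        if t_j <= t_fire < t_next:        # crossing falls inside this segment
            return t_fire
    return np.inf                         # neuron never reaches threshold

# Three input spikes; this neuron crosses threshold at about t = 2.36.
print(rel_psp_spike_time(np.array([1.0, 2.0, 3.0]), np.array([0.6, 0.5, -0.2])))

Because t_fire depends smoothly on the weights within each segment, the gradient of an output spike time with respect to each w_j is well defined, so timing errors can be backpropagated through the network in the event-driven manner the abstract describes.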


Subject(s)
Algorithms; Neural Networks, Computer; Neuronal Plasticity; Neurons/physiology; Synaptic Potentials