Results 1 - 2 of 2
1.
Neural Netw ; 159: 208-219, 2023 Feb.
Article in English | MEDLINE | ID: mdl-36657226

ABSTRACT

As neural networks grow in scale, techniques that enable them to run with low computational cost and high energy efficiency are required. To meet these demands, various efficient neural network paradigms, such as spiking neural networks (SNNs) and binary neural networks (BNNs), have been proposed. However, they suffer from persistent drawbacks, such as degraded inference accuracy and high latency. To solve these problems, we propose the single-step spiking neural network (S3NN), an energy-efficient neural network with low computational cost and high precision. The proposed S3NN processes information between hidden layers with spikes, as SNNs do. Nevertheless, it has no temporal dimension, so, like BNNs, it incurs no latency during training or inference. Thus, the proposed S3NN has a lower computational cost than SNNs, which require time-series processing. However, S3NN cannot adopt naïve backpropagation algorithms due to the non-differentiable nature of spikes. We derive a suitable neuron model by reducing the surrogate gradient for multi-time-step SNNs to a single time step. We experimentally demonstrated that the obtained surrogate gradient allows S3NN to be trained appropriately. We also showed that the proposed S3NN achieves accuracy comparable to full-precision networks while being highly energy efficient.
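The single-time-step surrogate gradient at the heart of this approach can be illustrated with a short sketch. The snippet below is a minimal, hypothetical PyTorch implementation: a Heaviside step fires binary spikes in the forward pass, while the backward pass substitutes a smooth stand-in for the step's ill-defined derivative. The class and function names and the particular surrogate (a scaled sigmoid derivative) are illustrative assumptions, not the paper's exact formulation.

```python
import torch


class SingleStepSpike(torch.autograd.Function):
    """Heaviside spike forward; steep-sigmoid surrogate backward."""

    @staticmethod
    def forward(ctx, u, alpha=4.0):
        ctx.save_for_backward(u)
        ctx.alpha = alpha
        # Forward: fire a binary spike wherever the potential exceeds zero.
        return (u > 0).to(u.dtype)

    @staticmethod
    def backward(ctx, grad_output):
        (u,) = ctx.saved_tensors
        # Backward: derivative of a steep sigmoid, a smooth surrogate for
        # the Dirac delta that differentiating the Heaviside would give.
        s = torch.sigmoid(ctx.alpha * u)
        return grad_output * ctx.alpha * s * (1.0 - s), None


def spike(u):
    return SingleStepSpike.apply(u)


# Usage: with no time dimension, a spiking layer trains like an ordinary
# feed-forward layer whose hidden activations happen to be binary.
layer = torch.nn.Linear(16, 32)
x = torch.randn(8, 16)
out = spike(layer(x))
out.sum().backward()  # gradients reach layer.weight via the surrogate
print(layer.weight.grad.shape)  # torch.Size([32, 16])
```

Because the spike carries no time dimension, there is no unrolling over time steps, which is where the latency and energy savings relative to multi-time-step SNNs come from.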


Subjects
Conservation of Energy Resources , Models, Neurological , Neural Networks, Computer , Algorithms , Neurons/physiology
2.
Sensors (Basel) ; 22(8), 2022 Apr 08.
Article in English | MEDLINE | ID: mdl-35458860

ABSTRACT

Biologically inspired spiking neural networks (SNNs) are widely used to realize ultralow-power consumption. However, deep SNNs are not easy to train because of excessive firing of the spiking neurons in the hidden layers. To tackle this problem, we propose a novel yet simple normalization technique called postsynaptic potential normalization. This normalization removes the subtraction term from standard normalization and uses the second raw moment, instead of the variance, as the division term. Applying this simple normalization to the postsynaptic potential controls spike firing and enables training to proceed appropriately. The experimental results show that SNNs with our normalization outperformed models using other normalization methods. Furthermore, with pre-activation residual blocks, the proposed model can be trained with more than 100 layers without any other special techniques dedicated to SNNs.
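For illustration, here is a minimal sketch of the normalization described above, assuming a per-channel reduction as in batch normalization: the mean is not subtracted, and the division term is the square root of the second raw moment E[u²] rather than the variance. The module name, the learnable scale, and the reduction axes are assumptions for illustration, not the authors' exact code.

```python
import torch


class PostsynapticPotentialNorm(torch.nn.Module):
    """Divide by the second raw moment; no mean subtraction (assumed form)."""

    def __init__(self, num_features, eps=1e-5):
        super().__init__()
        self.eps = eps
        # Learnable per-channel scale, as in standard normalization layers.
        self.weight = torch.nn.Parameter(torch.ones(num_features))

    def forward(self, u):
        # u: postsynaptic potential, shape (batch, channels, ...).
        dims = [d for d in range(u.dim()) if d != 1]
        # Second raw moment E[u^2] per channel; note: no mean subtraction.
        second_moment = (u * u).mean(dim=dims, keepdim=True)
        u_hat = u / torch.sqrt(second_moment + self.eps)
        shape = [1, -1] + [1] * (u.dim() - 2)
        return u_hat * self.weight.view(shape)


# Usage: normalize a batch of potentials before the spiking nonlinearity.
norm = PostsynapticPotentialNorm(num_features=32)
u = torch.randn(8, 32, 28, 28)
print(norm(u).shape)  # torch.Size([8, 32, 28, 28])
```

Dividing by E[u²] rather than the variance keeps the normalized potential bounded in magnitude without re-centering it around zero, which is consistent with the abstract's goal of suppressing excessive firing while preserving the sign of the potential.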


Subjects
Delayed Emergence from Anesthesia , Humans , Neural Networks, Computer , Neurons/physiology