Results 1 - 4 of 4
1.
Curr Issues Mol Biol ; 44(2): 817-832, 2022 Feb 07.
Article in English | MEDLINE | ID: mdl-35723341

ABSTRACT

Large-scale artificial neural networks contain many redundant structures, which can trap training in local optima and prolong training time. Moreover, existing neural network topology optimization algorithms are computationally expensive and require complex modeling of the network structure. We propose a Dynamic Node-based neural network Structure optimization algorithm (DNS) to address these issues. DNS consists of two steps: a generation step and a pruning step. In the generation step, the network adds hidden layers one at a time until accuracy reaches a threshold. In the pruning step, the network is then adapted with a pruning algorithm based on Hebb's rule or Pearson's correlation. In addition, we combine DNS with a genetic algorithm (GA-DNS). Experimental results show that, compared with traditional neural network topology optimization algorithms, GA-DNS generates neural networks with higher construction efficiency, lower structural complexity, and higher classification accuracy.
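The abstract does not give the pruning rule in detail; the following is a minimal sketch of what a Pearson-correlation-based pruning step could look like, assuming hidden nodes whose activations are nearly perfectly correlated are treated as redundant. The function name and threshold are illustrative, not from the paper.

```python
import numpy as np

def prune_correlated_nodes(activations, threshold=0.95):
    """Hypothetical sketch of correlation-based pruning.

    activations: (n_samples, n_nodes) matrix of hidden-layer activations.
    For each pair of nodes whose absolute Pearson correlation exceeds
    the threshold, the later node is marked redundant. Returns the
    indices of the nodes to keep.
    """
    n_nodes = activations.shape[1]
    corr = np.corrcoef(activations, rowvar=False)  # (n_nodes, n_nodes)
    keep = np.ones(n_nodes, dtype=bool)
    for i in range(n_nodes):
        if not keep[i]:
            continue  # already pruned; skip as a reference node
        for j in range(i + 1, n_nodes):
            if keep[j] and abs(corr[i, j]) > threshold:
                keep[j] = False  # node j duplicates node i
    return np.flatnonzero(keep)
```

A node that is a scaled copy of another (correlation 1.0) would be pruned, while independently activated nodes survive.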

2.
Brain Sci ; 12(2)2022 Jan 21.
Article in English | MEDLINE | ID: mdl-35203904

ABSTRACT

Small-sample learning is one of the most significant characteristics of the human brain, yet its mechanism is not fully understood. In recent years, brain-inspired artificial intelligence has become a very active research domain, with researchers exploring brain-inspired technologies and architectures to construct neural networks that could achieve human-like intelligence. In this work, we evaluate the effect of dynamic-behavior and topology co-learning of neurons and synapses on the small-sample learning ability of spiking neural networks. Results show that the co-learning mechanism presented here significantly reduces the number of required training samples while maintaining reasonable performance on the MNIST dataset, resulting in a very lightweight neural network structure.
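The abstract does not specify the neuron model's equations; as context for the "dynamic behavior" of spiking neurons, here is a minimal Euler-step sketch of a standard leaky integrate-and-fire (LIF) neuron, a common choice in SNN work. All parameter values are illustrative defaults, not taken from the paper.

```python
import numpy as np

def lif_step(v, i_in, v_rest=0.0, v_thresh=1.0, tau=20.0, dt=1.0):
    """One Euler step of a standard leaky integrate-and-fire neuron.

    Membrane dynamics: dv/dt = (-(v - v_rest) + i_in) / tau.
    When v crosses v_thresh the neuron emits a spike and its
    potential resets to v_rest.
    """
    v = v + dt * (-(v - v_rest) + i_in) / tau
    spike = v >= v_thresh
    v = np.where(spike, v_rest, v)  # reset spiking neurons
    return v, spike
```

With a constant suprathreshold input, the membrane potential charges toward the input level and the neuron fires periodically, which is the basic dynamic behavior such networks build on.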

3.
Article in English | MEDLINE | ID: mdl-33801219

ABSTRACT

A comprehensive understanding of the health status of a country's population is essential for developing sound public health policy, and correct inference of the underlying cause of death for each citizen is central to that understanding. Traditionally, this inference relies on manual review by medical staff, which requires substantial resources and is inefficient. In this work, we present an automatic method for inferring the underlying cause of death. We introduce a sink algorithm that performs this inference automatically. The results show that our sink algorithm generates reasonable output and outperforms other state-of-the-art algorithms. We believe it could greatly enhance the efficiency of correctly inferring underlying causes of death.


Subject(s)
Algorithms, Public Policy, Humans
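The abstract does not describe the sink algorithm's internals. Purely as a hypothetical illustration of the "sink" idea, one can picture the reported causes of death as a directed graph in which an edge (a, b) means condition a was caused by condition b; the underlying cause would then be a sink, a node with no outgoing edge. The function and the example conditions below are invented for illustration only.

```python
def find_sinks(edges):
    """Hypothetical illustration of sink-based inference.

    edges: iterable of (a, b) pairs meaning "a was caused by b".
    Returns the sorted list of sink nodes (nodes with no outgoing
    edge), i.e., candidate underlying causes.
    """
    nodes = set()
    has_out = set()
    for a, b in edges:
        nodes.update((a, b))
        has_out.add(a)  # a has an outgoing "caused by" edge
    return sorted(nodes - has_out)
```

In a chain such as cardiac arrest ← heart failure ← hypertension, the only sink is hypertension, which matches the intuition that the underlying cause sits at the end of the causal chain.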
4.
Brain Sci ; 11(2)2021 Jan 25.
Article in English | MEDLINE | ID: mdl-33503833

ABSTRACT

In neuroscience, the Default Mode Network (DMN), also known as the default network or the default-state network, is a large-scale brain network whose activity is highly correlated internally and distinct from that of other brain networks. Many studies have shown that the DMN can influence other cognitive functions. Motivated by this idea, this paper experimentally explores how a DMN could help Spiking Neural Networks (SNNs) on image classification problems. The approach emphasizes biological plausibility in model selection and parameter settings. For modeling, we select the Leaky Integrate-and-Fire (LIF) neuron model, use Additive White Gaussian Noise (AWGN) as the input DMN, and design the learning algorithm based on Spike-Timing-Dependent Plasticity (STDP). We then experiment on a two-layer SNN to evaluate the influence of the DMN on classification accuracy, and on a three-layer SNN to examine its influence on structure evolution; both results appear positive. Finally, we discuss possible directions for future work.
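The abstract names STDP as the learning rule but gives no equations; the following is a minimal sketch of the standard pair-based STDP weight update, with illustrative default parameters that are not taken from the paper.

```python
import numpy as np

def stdp_update(w, dt_spike, a_plus=0.01, a_minus=0.012,
                tau_plus=20.0, tau_minus=20.0, w_min=0.0, w_max=1.0):
    """Pair-based STDP update for one pre/post spike pair.

    dt_spike = t_post - t_pre. A pre-before-post pairing
    (dt_spike > 0) potentiates the synapse; post-before-pre
    depresses it. The magnitude decays exponentially with the
    spike-time difference, and the weight is clipped to
    [w_min, w_max].
    """
    if dt_spike > 0:
        dw = a_plus * np.exp(-dt_spike / tau_plus)
    else:
        dw = -a_minus * np.exp(dt_spike / tau_minus)
    return float(np.clip(w + dw, w_min, w_max))
```

Causal pairings strengthen the weight and anti-causal pairings weaken it, which is the mechanism the structure evolution in the three-layer experiment would rely on.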
