Results 1 - 14 of 14
1.
Neural Netw ; 179: 106527, 2024 Jul 09.
Article in English | MEDLINE | ID: mdl-39029298

ABSTRACT

A novel coronavirus that emerged in late 2019 (COVID-19) quickly spread into a global pandemic and was brought under control by 2022. Because of the virus's unpredictable mutations and the vaccines' waning potency, forecasting remains essential for resurgence prevention and medical resource management. Computational efficiency and long-term accuracy are two bottlenecks for national-level forecasting. This study develops a novel multivariate time series forecasting model, the densely connected highly flexible dendritic neuron model (DFDNM), to predict daily and weekly positive COVID-19 cases. The DFDNM's high-flexibility mechanism improves its capacity to deal with nonlinear challenges, and its densely introduced shortcut connections alleviate the vanishing and exploding gradient problems, encourage feature reuse, and improve feature extraction. To cope with the rapidly growing number of parameters, an improved variant of the adaptive moment estimation algorithm (AdamW) is employed as the learning algorithm because of its strong optimization ability. Experimental results and statistical analysis across three Japanese prefectures confirm the efficacy and feasibility of the DFDNM, which outperforms various state-of-the-art machine learning models. To the best of our knowledge, the proposed DFDNM is the first model to restructure the dendritic neuron model's neural architecture, demonstrating promising use in multivariate time series prediction. Because of its strong performance, the DFDNM may serve as an important reference for national and regional government decision-makers aiming to optimize pandemic prevention and medical resource management. We also verify that the DFDNM is efficiently applicable not only to COVID-19 transmission prediction but also to more general multivariate prediction tasks, which leads us to believe that it could serve as a promising prediction model in other fields.
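The abstract names AdamW (Adam with decoupled weight decay) as the learning algorithm. As a point of reference only, here is a minimal sketch of a generic AdamW update step in plain numpy; the hyperparameter values are conventional defaults, not the paper's settings, and the paper's variant may differ:

```python
import numpy as np

def adamw_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
               eps=1e-8, weight_decay=1e-2):
    """One AdamW update: Adam moment estimates plus decoupled weight decay."""
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias corrections
    v_hat = v / (1 - beta2 ** t)
    # decoupled weight decay: applied to w directly, not folded into grad
    w = w - lr * (m_hat / (np.sqrt(v_hat) + eps) + weight_decay * w)
    return w, m, v

w = np.array([1.0, -2.0])
m = np.zeros(2)
v = np.zeros(2)
w, m, v = adamw_step(w, np.array([0.1, -0.1]), m, v, t=1)
```

Each parameter moves opposite to its bias-corrected, variance-normalized gradient, plus a small pull toward zero from the decay term.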

2.
Front Neurosci ; 17: 1229275, 2023.
Article in English | MEDLINE | ID: mdl-37674518

ABSTRACT

Orientation detection is an essential function of the visual system. In our previous works, we proposed a new orientation detection mechanism based on local orientation-selective neurons. We assume that there are neurons solely responsible for orientation detection, with each neuron dedicated to detecting a specific local orientation; the global orientation is then inferred from the local orientation information. Based on this mechanism, we propose an artificial visual system (AVS) that uses a single layer of McCulloch-Pitts neurons to realize these local orientation-sensitive neurons and a sum-pooling layer to realize global orientation detection neurons. We demonstrate that such a single-layer perceptron AVS is capable of detecting global orientation by identifying the orientation with the largest number of activated orientation-selective neurons. To evaluate the effectiveness of this single-layer perceptron AVS, we perform computer simulations. The results show that the AVS works perfectly for global orientation detection, in line with the majority of physiological experiments and models. Moreover, we compare the performance of the single-layer perceptron AVS with that of a traditional convolutional neural network (CNN) on orientation detection tasks and find that the AVS outperforms the CNN in identification accuracy, noise resistance, computational and learning cost, hardware implementation feasibility, and biological plausibility.
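The described mechanism (local McCulloch-Pitts orientation detectors, sum pooling, winner-take-all readout) can be sketched in a few lines. The pixel-pair receptive field and the four orientation angles below are illustrative assumptions, not the paper's exact architecture:

```python
import numpy as np

# Local orientation-selective MP neurons: each fires (1) when two active
# pixels line up along its preferred orientation; a sum-pooling layer counts
# the firings per orientation; the arg-max gives the global orientation.
OFFSETS = {0: (0, 1), 45: (-1, 1), 90: (-1, 0), 135: (-1, -1)}  # (dy, dx)

def global_orientation(img):
    img = np.asarray(img)
    h, w = img.shape
    counts = {}
    for ang, (dy, dx) in OFFSETS.items():
        c = 0
        for y in range(h):
            for x in range(w):
                y2, x2 = y + dy, x + dx
                if 0 <= y2 < h and 0 <= x2 < w:
                    # McCulloch-Pitts neuron: threshold on the pixel pair
                    c += int(img[y, x] + img[y2, x2] >= 2)
        counts[ang] = c                    # sum-pooling layer
    return max(counts, key=counts.get)    # winner-take-all readout

bar = np.zeros((5, 5), dtype=int)
bar[2, :] = 1                             # horizontal bar -> 0 degrees
```

A horizontal bar activates only the 0-degree detectors, so the readout reports 0; a vertical bar would report 90.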

3.
Sci Rep ; 13(1): 12744, 2023 Aug 07.
Article in English | MEDLINE | ID: mdl-37550464

ABSTRACT

The slime mold algorithm (SMA) is a nature-inspired algorithm that simulates biological optimization mechanisms and has achieved great results on various complex stochastic optimization problems. Owing to its simulated biological search principle, SMA has a unique advantage in global optimization problems. However, it still misses the optimal solution or collapses to a local optimum when facing complicated problems. To overcome these drawbacks, we add a novel multi-chaotic local operator to the bio-shock feedback mechanism of SMA, using the perturbation nature of chaotic operators to compensate for its insufficient exploration of the local solution space. Based on this, we propose an improved algorithm, MCSMA, by investigating how to improve the probabilistic selection of chaotic operators based on the maximum Lyapunov exponent (MLE), an inherent property of chaotic maps. We compare MCSMA with other state-of-the-art methods on IEEE Congress on Evolutionary Computation (CEC) benchmarks, i.e., the CEC2017 benchmark test suites and the CEC2011 practical problems, to demonstrate its potency, and perform dendritic neuron model training to test MCSMA's robustness on classification problems. Finally, the parameters' sensitivities of MCSMA, the utilization of the solution space, and the effectiveness of the MLE are adequately discussed.
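One plausible reading of "MLE-based probabilistic selection of chaotic operators" is sketched below: each candidate chaotic map is chosen with probability proportional to its MLE, and the chosen map drives a local perturbation. The map pool, the MLE values, and the proportional rule are all illustrative assumptions; the paper's actual selection rule may differ:

```python
import numpy as np

# Candidate chaotic maps on [0, 1] with (approximate, assumed) MLE values.
MAPS = {
    "logistic": (lambda z: 4 * z * (1 - z), 0.693),
    "tent":     (lambda z: 2 * z if z < 0.5 else 2 * (1 - z), 0.693),
    "sine":     (lambda z: float(np.sin(np.pi * z)), 0.69),
}

def chaotic_perturb(x, z, rng, scale=0.1):
    """Pick a map with probability proportional to its MLE, advance its
    chaotic state, and perturb the solution x within [-scale, scale]."""
    names = list(MAPS)
    mles = np.array([MAPS[n][1] for n in names])
    name = rng.choice(names, p=mles / mles.sum())  # MLE-proportional choice
    z = MAPS[name][0](z)                           # advance chaotic state
    return x + scale * (2 * z - 1), z

rng = np.random.default_rng(0)
x, z = 0.5, 0.7
for _ in range(5):
    x, z = chaotic_perturb(x, z, rng)
```

The chaotic state stays in [0, 1], so each step moves the solution by at most `scale`, giving a bounded local search around the incumbent.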

4.
Comput Intell Neurosci ; 2023: 7037124, 2023.
Article in English | MEDLINE | ID: mdl-36726357

ABSTRACT

Deep learning (DL) has achieved breakthrough successes in various tasks, owing to its layer-by-layer information processing and sufficient model complexity. However, DL suffers from redundant model complexity and low interpretability, mainly because of its oversimplified basic McCulloch-Pitts neuron unit. A widely recognized, biologically plausible dendritic neuron model (DNM) has demonstrated its effectiveness in alleviating these issues, but it can only solve binary classification tasks, which significantly limits its applicability. In this study, a novel extended network based on the dendritic structure is proposed, enabling it to solve multi-class classification problems, and an efficient error-backpropagation learning algorithm is derived for it for the first time. Extensive experiments demonstrate the effectiveness and superiority of the proposed method in comparison with nine other state-of-the-art classifiers on ten datasets, including a real-world quality-of-web-service application. The results suggest that the proposed learning algorithm is competent and reliable in terms of classification performance and stability and has a notable advantage on small-scale imbalanced data. Additionally, aspects of the network structure constrained by scale are examined.


Subject(s)
Algorithms , Neurons , Neurons/physiology , Software
5.
IEEE Trans Neural Netw Learn Syst ; 34(4): 2105-2118, 2023 04.
Article in English | MEDLINE | ID: mdl-34487498

ABSTRACT

A single dendritic neuron model (DNM) that possesses the nonlinear information-processing ability of dendrites has been widely used for classification and prediction. Complex-valued neural networks consisting of multiple- or deep-layer McCulloch-Pitts neurons have achieved great successes since neural computing was first applied to signal processing, yet no complex-valued representations have appeared in single-neuron architectures. In this article, we extend the DNM from the real-valued domain to the complex-valued one. The performance of the complex-valued DNM (CDNM) is evaluated on a complex XOR problem, a non-minimum-phase equalization problem, and a real-world wind prediction task. We also present a comparative analysis of a set of elementary transcendental functions as activation functions and carry out preparatory experiments to determine hyperparameters. The experimental results indicate that the proposed CDNM significantly outperforms the real-valued DNM, the complex-valued multilayer perceptron, and other complex-valued neuron models.


Subject(s)
Neural Networks, Computer , Neurons , Signal Processing, Computer-Assisted , Algorithms
6.
Brain Sci ; 12(4)2022 Apr 01.
Article in English | MEDLINE | ID: mdl-35448001

ABSTRACT

The Hubel-Wiesel (HW) model is a classical neurobiological model explaining the orientation selectivity of cortical cells. However, the HW model has not been fully verified physiologically, and there are few concise but efficient systems that quantify and simulate it and can be used for object orientation detection. To realize a straightforward and efficient quantitative method and validate the HW model's reasonability and practicality, we use the McCulloch-Pitts (MP) neuron model to simulate simple cells and complex cells and implement an artificial visual system (AVS) for two-dimensional object orientation detection. First, we realize four types of simple cells, each responsible only for detecting a specific orientation angle locally; complex cells are realized with a sum function. The local orientation information of an object is collected by simple cells and then converged to the corresponding complex cells of the same type to compute a global activation degree, and the global orientation is obtained from the activation degree of each type of complex cell. Based on this scheme, an AVS for global orientation detection is constructed. We conducted computer simulations to prove the feasibility and effectiveness of the scheme and the AVS. The simulations show that the mechanism-based AVS achieves accurate orientation discrimination and exhibits striking similarities with the natural visual system, which indirectly supports the rationality of the Hubel-Wiesel model. Furthermore, compared with a traditional CNN, our AVS performs better on orientation detection tasks in identification accuracy, noise resistance, computation and learning cost, hardware implementation, and reasonability.

7.
Comput Intell Neurosci ; 2021: 5227377, 2021.
Article in English | MEDLINE | ID: mdl-34966420

ABSTRACT

Microarray gene expression data provide a promising way to diagnose disease and classify cancer. However, the gene selection problem in bioinformatics, i.e., how to select the most informative genes from thousands of candidates, remains challenging. It is a specific feature selection problem with high-dimensional features and small sample sizes. In this paper, a two-stage method combining a filter feature selection method and a wrapper feature selection method is proposed to solve the gene selection problem. In contrast to common methods, the proposed method models gene selection as a multiobjective optimization problem. Both stages employ the same multiobjective differential evolution (MODE) search strategy but incorporate different objective functions: the three objectives of the filter stage are mainly based on mutual information, while the two objectives of the wrapper stage are the number of selected features and the classification error of a naive Bayes (NB) classifier. Finally, the performance of the proposed method is tested and analyzed on six benchmark gene expression datasets. The experimental results verify that applying a multiobjective optimization algorithm is a novel and effective way to solve the gene selection problem.


Subject(s)
Algorithms , Computational Biology , Bayes Theorem , Gene Expression , Prospective Studies
8.
IEEE Trans Neural Netw Learn Syst ; 32(11): 5194-5207, 2021 11.
Article in English | MEDLINE | ID: mdl-33156795

ABSTRACT

An approximate logic neural model (ALNM) is a novel single-neuron model with plastic dendritic morphology. During training, the model can eliminate unnecessary synapses and useless dendritic branches, producing a specific dendritic structure for a particular task. The simplified structure of the ALNM can be substituted by a logic circuit classifier (LCC) without losing any essential information. The LCC consists merely of comparators and logic NOT, AND, and OR gates, so it can be easily implemented in hardware. However, the architecture of the ALNM affects the learning capacity, generalization capability, computing time, and approximation accuracy of the LCC. A Pareto-based multiobjective differential evolution (MODE) algorithm is therefore proposed to simultaneously optimize the ALNM's topology and weights, generating a concise and accurate LCC for each specific task. To verify the effectiveness of MODE, extensive experiments are performed on eight benchmark classification problems. The statistical results demonstrate that MODE is superior to conventional learning methods, such as the backpropagation algorithm and single-objective evolutionary algorithms. In addition, compared with several commonly used classifiers, both the ALNM and the LCC obtain promising and competitive classification performance on the benchmark problems, and the LCC achieves faster classification than the other classifiers.
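A classifier built only from comparators and NOT/AND/OR gates, as described above, can be sketched directly. The thresholds and branch layout below are invented for illustration; in the paper they would be distilled from a trained ALNM:

```python
# Comparators binarize each input against a threshold; AND gates combine
# the literals on each dendritic branch; an OR gate across branches gives
# the final class decision.
def comparator(x, theta, negate=False):
    bit = x >= theta
    return not bit if negate else bit   # NOT gate when negate=True

def lcc(features, branches):
    # branches: list of branch specs; each spec is a list of
    # (feature_index, threshold, negate) literals joined by AND.
    return any(                                   # OR across branches
        all(comparator(features[i], th, neg)      # AND within a branch
            for i, th, neg in branch)
        for branch in branches
    )

# Hypothetical circuit: fires iff (x0 >= 0.5 AND x1 < 0.3) OR (x2 >= 0.8)
branches = [[(0, 0.5, False), (1, 0.3, True)], [(2, 0.8, False)]]
```

Because every operation is a threshold or a Boolean gate, such a circuit maps one-to-one onto hardware, which is the point of the ALNM-to-LCC conversion.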


Subject(s)
Algorithms , Databases, Factual/standards , Logic , Neural Networks, Computer , Dendrites/physiology , Humans , Neuronal Plasticity/physiology , Reproducibility of Results , Synapses/physiology
9.
Comput Intell Neurosci ; 2020: 2710561, 2020.
Article in English | MEDLINE | ID: mdl-32405292

ABSTRACT

A dendritic neuron model with adaptive synapses (DMAS) trained by the differential evolution (DE) algorithm is proposed. Following the order of signal transmission, the model can be divided into four parts: the synaptic layer, the dendritic layer, the membrane layer, and the soma layer. After training, it can be converted into a logic circuit that is easily implemented in hardware by removing useless synapses and dendrites. This logic circuit can solve complex nonlinear problems using only four basic logical devices: comparators and AND (conjunction), OR (disjunction), and NOT (negation) gates. To obtain a faster and better solution, we adopt the popular DE algorithm for DMAS training. We chose five classification datasets from the UCI Machine Learning Repository for the experiments. We analyze and discuss the experimental results in terms of accuracy, convergence rate, ROC curve, and cross-validation, and compare them with a dendritic neuron model trained by the backpropagation algorithm (BP-DNM) and a neural network trained by backpropagation (BPNN). The analysis shows that DE-DMAS performs better in all aspects.
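The four-layer signal path described above (synaptic sigmoid, dendritic product, membrane sum, soma sigmoid) is the standard dendritic-neuron forward pass and can be sketched compactly. The constants `k`, `ks`, and `theta` are illustrative hyperparameters, not the paper's values:

```python
import numpy as np

def dnm_forward(x, w, q, k=5.0, ks=5.0, theta=0.5):
    """Forward pass through the four layers in signal-transmission order.

    x: inputs, shape (n_features,)
    w, q: synaptic weights and thresholds, shape (n_branches, n_features)
    """
    y = 1.0 / (1.0 + np.exp(-k * (w * x - q)))      # synaptic layer (sigmoid)
    z = np.prod(y, axis=1)                          # dendritic layer (product)
    v = np.sum(z)                                   # membrane layer (sum)
    return 1.0 / (1.0 + np.exp(-ks * (v - theta)))  # soma layer (sigmoid)

rng = np.random.default_rng(1)
x = np.array([0.2, 0.8, 0.5])
w = rng.standard_normal((4, 3))   # 4 dendritic branches, 3 inputs
q = rng.standard_normal((4, 3))
out = dnm_forward(x, w, q)
```

In a DE-trained variant, `w` and `q` would be the genes of each candidate solution; a synapse whose sigmoid saturates near 1 for all inputs contributes nothing to its branch's product and can be pruned, which is what enables the logic-circuit conversion.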


Subject(s)
Algorithms , Dendrites , Models, Neurological , Neural Networks, Computer , Synapses , Animals , Humans
10.
Comput Intell Neurosci ; 2019: 7362931, 2019.
Article in English | MEDLINE | ID: mdl-31485216

ABSTRACT

By employing a neuron plasticity mechanism, the original dendritic neuron model (DNM) has succeeded in classification tasks with both encouraging accuracy and a simple learning rule. However, data collected in the real world contain considerable redundancy, which makes analyzing data with the DNM complicated and time-consuming. This paper proposes a reliable hybrid model that combines a maximum-relevance minimum-redundancy (Mr2) feature selection technique with the DNM (namely, Mr2DNM) for practical classification problems. The mutual-information-based Mr2 is applied to evaluate and rank the most informative and discriminative features of a given dataset, and the resulting optimal feature subset is used to train and test the DNM on five different problems arising from medical, physical, and social scenarios. Experimental results suggest that the proposed Mr2DNM outperforms the DNM and six other classification algorithms in terms of accuracy and computational efficiency.
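The Mr2 filter stage can be sketched as greedy mutual-information ranking: each step picks the feature with the highest relevance to the label minus its average redundancy with features already selected. The difference form of the criterion and the discrete-feature MI estimator below are common choices, assumed here rather than taken from the paper:

```python
import numpy as np
from collections import Counter

def mutual_info(a, b):
    """Mutual information (in nats) between two discrete sequences."""
    n = len(a)
    pa, pb = Counter(a), Counter(b)
    pab = Counter(zip(a, b))
    return sum((c / n) * np.log((c / n) / ((pa[x] / n) * (pb[y] / n)))
               for (x, y), c in pab.items())

def mrmr_rank(X, y, k):
    """Greedy max-relevance min-redundancy ranking (difference criterion)."""
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < k:
        def score(j):
            rel = mutual_info(X[:, j].tolist(), y)   # relevance to the label
            red = (np.mean([mutual_info(X[:, j].tolist(), X[:, s].tolist())
                            for s in selected]) if selected else 0.0)
            return rel - red                         # Mr2 criterion
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

y = [0, 0, 1, 1, 0, 1]
X = np.array([[0, 1], [0, 0], [1, 1], [1, 0], [0, 1], [1, 0]])  # col 0 == y
```

On this toy data, column 0 is a perfect copy of the label, so it is ranked first; the selected subset would then be handed to the DNM for training.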


Subject(s)
Algorithms , Neuronal Plasticity/physiology , Neurons/physiology , Dendritic Cells/physiology , Models, Biological , Support Vector Machine
11.
Int J Neural Syst ; 29(8): 1950012, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31189391

ABSTRACT

Neurons are the fundamental units of the brain and nervous system. Developing good models of neurons is very important not only to neurobiology but also to computer science and many other fields. The McCulloch-Pitts neuron model is the most widely used, but it has long been criticized as oversimplified in view of the properties of real neurons and the computations they perform. On the other hand, it has become widely accepted that dendrites play a key role in the overall computation performed by a neuron. However, modeling dendritic computations and assigning the right synapses to the right dendrites remain open problems in the field. Here, we propose a novel dendritic neural model (DNM) that mimics the essence of the known nonlinear interactions among inputs to the dendrites. In the model, each input is connected to branches through a distance-dependent nonlinear synapse, and each branch performs a simple multiplication on its inputs. The soma then sums the weighted products from all branches and produces the neuron's output signal. We show that the rich nonlinear dendritic response and powerful nonlinear neural computational capability, as well as many known neurobiological phenomena of neurons and dendrites, can be understood and explained by the DNM. Furthermore, we show that the model is capable of learning and developing an internal structure appropriate for a particular task, such as the location and type of synapses on the dendritic branches, and demonstrate this on a linearly nonseparable problem, a real-world benchmark problem (Glass classification), and the directional selectivity problem.


Subject(s)
Dendrites , Models, Neurological , Neurons , Nonlinear Dynamics , Synapses , Machine Learning
12.
Comput Intell Neurosci ; 2018: 9390410, 2018.
Article in English | MEDLINE | ID: mdl-29606961

ABSTRACT

Credit classification models are widely applied because they help financial decision-makers handle credit classification issues, and among them artificial neural networks (ANNs) have been widely accepted as convincing methods in the credit industry. In this paper, we propose a pruning neural network (PNN) and apply it to credit classification using the well-known Australian and Japanese credit datasets. The model is inspired by the synaptic nonlinearity of a dendritic tree in a biological neuron and is trained by an error-backpropagation algorithm. It realizes a neuronal pruning function by removing superfluous synapses and useless dendrites, forming a tidy dendritic morphology at the end of learning. Furthermore, we use logic circuits (LCs) to simulate the resulting dendritic structures, which allows the PNN to be implemented efficiently in hardware. The statistical results of our experiments verify that the PNN obtains superior performance in comparison with other classical algorithms in terms of accuracy and computational efficiency.


Subject(s)
Algorithms , Neural Networks, Computer , Humans
13.
IEEE/ACM Trans Comput Biol Bioinform ; 15(4): 1365-1378, 2018.
Article in English | MEDLINE | ID: mdl-28534784

ABSTRACT

The problem of predicting the three-dimensional (3-D) structure of a protein from its one-dimensional sequence has been called the "holy grail of molecular biology", and it has become an important part of structural genomics projects. Despite rapid developments in computer technology and computational intelligence, it remains challenging and fascinating. In this paper, we propose a multi-objective evolutionary algorithm to solve it. We decompose the Chemistry at HARvard Macromolecular Mechanics (CHARMM) protein energy function into bond and non-bond energies as the first and second objectives and, to account for the effect of solvent, innovatively adopt the solvent-accessible surface area as the third objective. We use 66 benchmark proteins to verify the proposed method and obtain better or competitive results in comparison with existing methods. The results suggest the necessity of incorporating the effect of solvent into a multi-objective evolutionary algorithm to improve protein structure prediction in terms of accuracy and efficiency.
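Once the energy function is split into three objectives, candidate conformations are compared by Pareto dominance rather than by a single score. A minimal sketch of that comparison (the objective values below are invented for illustration):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse on every objective and strictly better on at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

# Hypothetical conformations scored as (bond energy, non-bond energy, SASA):
c1 = (10.0, -120.0, 85.0)
c2 = (12.0, -110.0, 90.0)   # worse than c1 on every objective
c3 = (9.0, -100.0, 80.0)    # better on two objectives, worse on one
```

`c1` dominates `c2`, while `c1` and `c3` are mutually non-dominated; a multi-objective evolutionary algorithm keeps such non-dominated conformations as its evolving Pareto front.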


Subject(s)
Algorithms , Computational Biology/methods , Protein Conformation , Proteins , Databases, Protein , Hydrophobic and Hydrophilic Interactions , Models, Molecular , Proteins/chemistry , Proteins/genetics , Solvents , Water
14.
Neural Netw ; 60: 96-103, 2014 Dec.
Article in English | MEDLINE | ID: mdl-25170564

ABSTRACT

Recent research has provided strong circumstantial support for dendrites playing a key, and possibly essential, role in computations. In this paper, we propose an unsupervised learnable neuron model that includes the nonlinear interactions between excitation and inhibition on dendrites. The model neuron self-adjusts its synaptic parameters, including the assignment of synapses to dendrites, according to a generalized delta-rule-like algorithm. We use the model to simulate directionally selective cells with this unsupervised learning algorithm: the interactions and dendrites of the neuron are initialized randomly, and the algorithm learns the two-dimensional multi-directional selectivity problem without an external teacher signal. Simulation results show that directionally selective cells can be formed by unsupervised learning, with the required number of dendritic branches acquired, enhanced where needed, and eliminated where not. Further, the results show whether a synapse exists and, if so, where it is located and what type (excitatory or inhibitory) it is. This leads us to believe that the proposed neuron model may be considerably more powerful computationally than the McCulloch-Pitts model, because theoretically a single neuron or a single layer of such neurons is capable of solving any complex problem. It may also lead to a completely new technique for analyzing the mechanisms and principles of neurons, dendrites, and synapses.


Subject(s)
Dendrites/physiology , Learning , Models, Neurological , Synapses/physiology , Algorithms , Computer Simulation , Humans