Neural Netw ; 140: 237-246, 2021 Aug.
Article in English | MEDLINE | ID: mdl-33794415

ABSTRACT

To adapt to the limited computing resources of terminal devices in industrial applications, a randomized neural network called the stochastic configuration network (SCN) was proposed; it can be trained effectively without a GPU. SCN uses a supervisory random mechanism to assign its input weights and hidden biases, which makes it more stable than other randomized algorithms but also makes model training time-consuming. To alleviate this problem, this paper proposes a novel bidirectional SCN algorithm (BSCN), which adds hidden nodes in two modes: forward learning and backward learning. In the forward learning mode, BSCN uses the same supervisory mechanism as SCN to configure the parameters of the newly added nodes. In the backward learning mode, BSCN computes the parameters in one step from the residual-error feedback of the current model. The two learning modes alternate until the prediction error of the model reaches an acceptable level or the number of hidden nodes reaches its maximum. This semi-random learning mechanism greatly accelerates training of the BSCN model and significantly improves the quality of the hidden nodes. Extensive experiments on ten benchmark regression problems, two real-life air-pollution prediction problems, and a classical image-processing problem show that BSCN achieves faster training, higher stability, and better generalization than SCN.
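
The abstract describes BSCN's two node-addition modes only at a high level. The following Python sketch is a rough illustration of that alternating scheme, not the authors' implementation: the forward mode samples random candidate nodes and keeps the best one under a simplified score that stands in for SCN's supervisory inequality, while the backward mode derives a node's parameters in one shot from the residual error via an inverse-sigmoid least-squares fit. The candidate-pool size, the scoring rule, and the backward-mode fit are all assumptions made for illustration.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def add_forward_node(X, residual, n_candidates=50, scale=1.0):
    # Forward (SCN-style) mode: sample random candidate nodes and keep the one
    # that best explains the current residual. The score below is a simplified
    # stand-in for SCN's supervisory inequality, not the paper's exact test.
    best_w, best_b, best_score = None, None, -np.inf
    for _ in range(n_candidates):
        w = rng.uniform(-scale, scale, size=X.shape[1])
        b = rng.uniform(-scale, scale)
        h = sigmoid(X @ w + b)
        score = (h @ residual) ** 2 / (h @ h)   # residual reduction if this node is kept
        if score > best_score:
            best_w, best_b, best_score = w, b, score
    return best_w, best_b

def add_backward_node(X, residual):
    # Backward mode: compute the node parameters in one shot from the residual
    # error feedback. Here we map the residual into the sigmoid's range, invert
    # the sigmoid, and solve a linear least-squares problem -- an illustrative
    # assumption, not the closed-form update given in the paper.
    r01 = (residual - residual.min()) / (residual.max() - residual.min() + 1e-12)
    r01 = np.clip(r01, 1e-3, 1 - 1e-3)
    z = np.log(r01 / (1 - r01))
    A = np.hstack([X, np.ones((X.shape[0], 1))])
    sol, *_ = np.linalg.lstsq(A, z, rcond=None)
    return sol[:-1], sol[-1]

def train_bscn(X, y, max_nodes=20, tol=1e-3):
    # Alternate forward and backward learning until the error is acceptable
    # or the hidden layer reaches its maximum size.
    W, B, beta = [], [], None
    residual = y.copy()
    for k in range(max_nodes):
        w, b = (add_forward_node if k % 2 == 0 else add_backward_node)(X, residual)
        W.append(w); B.append(b)
        H = sigmoid(X @ np.array(W).T + np.array(B))
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # output weights by least squares
        residual = y - H @ beta
        if np.sqrt(np.mean(residual ** 2)) < tol:
            break
    return np.array(W), np.array(B), beta

# Toy usage: approximate a 1-D nonlinear function.
X = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y = np.sin(3.0 * X[:, 0]) + 0.05 * rng.standard_normal(200)
W, B, beta = train_bscn(X, y)
pred = sigmoid(X @ W.T + B) @ beta
print("train RMSE:", np.sqrt(np.mean((y - pred) ** 2)))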


Subject(s)
Supervised Machine Learning , Random Allocation , Regression Analysis , Stochastic Processes