1.
Article in English | MEDLINE | ID: mdl-38809742

ABSTRACT

Echo state networks (ESNs) are time series processing models working under the echo state property (ESP) principle. The ESP is a notion of stability that imposes an asymptotic fading of the memory of the input. On the other hand, the resulting inherent architectural bias of ESNs may lead to an excessive loss of information, which in turn harms performance in tasks with long short-term memory requirements. To bring together the fading memory property and the ability to retain as much memory as possible, in this article we introduce a new ESN architecture, called the Edge of Stability ESN. The introduced model defines the reservoir layer as a convex combination of a nonlinear reservoir (as in the standard ESN) and a linear reservoir that implements an orthogonal transformation. By virtue of a thorough mathematical analysis, we prove that the whole eigenspectrum of the Jacobian of the map can be contained in an annular neighborhood of a complex circle of controllable radius. This property is exploited to tune the model's dynamics close to the edge-of-chaos regime by design. Remarkably, our experimental analysis shows that the model can reach the theoretical maximum short-term memory capacity (MC). At the same time, in comparison to conventional reservoir approaches, the proposed model is shown to offer an excellent trade-off between memory and nonlinearity, as well as a significant improvement in performance on autoregressive nonlinear modeling and real-world time series modeling tasks.
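The convex-combination reservoir described in this abstract can be sketched in a few lines of NumPy. This is a minimal illustration of the general idea, not the paper's exact formulation: the mixing coefficient name (`beta`), the spectral-radius scaling, and all numeric values are illustrative assumptions, and the orthogonal matrix is obtained via a QR decomposition of a random Gaussian matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100  # reservoir size (illustrative)

# Random input and recurrent weights, as in a standard ESN.
W_in = rng.uniform(-0.1, 0.1, size=(n, 1))
W = rng.uniform(-1.0, 1.0, size=(n, n))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # rescale spectral radius (illustrative value)

# Orthogonal matrix for the linear reservoir, via QR of a random Gaussian matrix.
O, _ = np.linalg.qr(rng.standard_normal((n, n)))

beta = 0.1  # mixing coefficient between the two reservoirs (hypothetical value)

def step(x, u):
    """Convex combination of an orthogonal linear map and a
    standard tanh reservoir update, per the abstract's description."""
    return (1 - beta) * (O @ x) + beta * np.tanh(W @ x + W_in @ u)

# Drive the reservoir with a toy scalar input signal.
x = np.zeros(n)
for t in range(200):
    x = step(x, np.array([np.sin(0.1 * t)]))
```

Because the orthogonal part is norm-preserving, the linear component carries input history forward without decay, while the `tanh` component supplies the nonlinearity; `beta` trades one off against the other.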

2.
Neural Netw ; 108: 33-47, 2018 Dec.
Article in English | MEDLINE | ID: mdl-30138751

ABSTRACT

In this paper, we provide a novel approach to the architectural design of deep Recurrent Neural Networks using signal frequency analysis. In particular, focusing on the Reservoir Computing framework and inspired by the principles related to the inherent effect of layering, we address a fundamental open issue in deep learning, namely the question of how to establish the number of layers in recurrent architectures in the form of deep echo state networks (DeepESNs). The proposed method is first analyzed and refined in a controlled scenario, and then experimentally assessed on challenging real-world tasks. The achieved results also show the ability of properly designed DeepESNs to outperform RC approaches on a speech recognition task, and to compete with the state-of-the-art in time-series prediction on polyphonic music tasks.
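The layered architecture this abstract studies can be sketched as a stack of reservoirs in which each layer is fed the state of the layer below. The sketch below is a generic DeepESN state update under assumed settings (3 layers, 50 leaky-integrator units each); the number of layers is precisely the quantity the paper's frequency-analysis method is designed to choose, and all constants here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
layers = 3          # number of stacked reservoirs (the open design question)
n, leak = 50, 0.5   # units per layer and leaky-integration rate (illustrative)

# Layer 0 reads the external input; deeper layers read the layer below.
W_in = [rng.uniform(-0.1, 0.1, size=(n, 1 if l == 0 else n)) for l in range(layers)]
W = []
for _ in range(layers):
    M = rng.uniform(-1.0, 1.0, size=(n, n))
    W.append(0.9 * M / max(abs(np.linalg.eigvals(M))))  # rescaled spectral radius

def step(states, u):
    """One DeepESN update: each layer's leaky-integrator state is driven
    by its own recurrence plus the state of the layer beneath it."""
    new, inp = [], u
    for l in range(layers):
        x = (1 - leak) * states[l] + leak * np.tanh(W[l] @ states[l] + W_in[l] @ inp)
        new.append(x)
        inp = x  # deeper layers operate on progressively slower dynamics
    return new

states = [np.zeros(n) for _ in range(layers)]
for t in range(100):
    states = step(states, np.array([np.sin(0.2 * t)]))
```

The layering matters because each level low-pass filters the dynamics of the one below, so successive layers naturally develop multiple time scales; the paper's contribution is a principled, frequency-based criterion for how many such layers a task warrants.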


Subject(s)
Neural Networks, Computer , Pattern Recognition, Physiological , Speech , Humans
3.
Neural Netw ; 24(5): 440-56, 2011 Jun.
Article in English | MEDLINE | ID: mdl-21376531

ABSTRACT

Echo State Networks (ESNs) constitute an emerging approach for efficiently modeling Recurrent Neural Networks (RNNs). In this paper we investigate some of the main aspects that may account for the success and limitations of this class of models. In particular, we propose complementary classes of factors related to the contractivity and architecture of reservoirs, and we study their relative relevance. First, we show the existence of a class of tasks for which ESN performance is independent of the architectural design. The effect of the Markovian factor, characterizing a significant class within these cases, is shown by introducing instances of easy/hard tasks for ESNs characterized by contractivity of reservoir dynamics. In the complementary cases, for which architectural design is effective, we investigate and decompose the aspects of network design that allow a larger reservoir to progressively improve the predictive performance. In particular, we introduce four key architectural factors: input variability, multiple time-scale dynamics, nonlinear interactions among units, and regression in an augmented feature space. To investigate the quantitative effects of the different architectural factors within this class of tasks successfully approached by ESNs, variants of the basic ESN model are proposed and tested on instances of datasets of different nature and difficulty. Experimental evidence confirms the role of the Markovian factor and shows that all the identified key architectural factors have a major role in determining ESN performance.
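The basic ESN that this analysis starts from is simple enough to sketch end to end: a fixed contractive reservoir driven by the input, followed by a linear readout trained by regression. The sketch below uses assumed settings throughout (spectral radius 0.9, ridge regularization, a toy delay-recall target); none of these come from the paper itself.

```python
import numpy as np

rng = np.random.default_rng(2)
n, washout, T = 100, 50, 500  # reservoir size, discarded transient, series length

# Fixed random input and recurrent weights; only the readout is trained.
W_in = rng.uniform(-0.5, 0.5, size=(n, 1))
M = rng.uniform(-1.0, 1.0, size=(n, n))
W = 0.9 * M / max(abs(np.linalg.eigvals(M)))  # contractive-leaning scaling (illustrative)

# Toy task: reproduce the input signal delayed by 5 steps.
u = rng.uniform(-0.8, 0.8, size=T)
target = np.roll(u, 5)

# Collect reservoir states driven by the input.
X = np.zeros((T, n))
x = np.zeros(n)
for t in range(T):
    x = np.tanh(W @ x + W_in @ np.array([u[t]]))
    X[t] = x

# Linear readout via ridge regression, fitted after the washout period.
A, y = X[washout:], target[washout:]
w_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(n), A.T @ y)
pred = X @ w_out
```

The contractivity of `W` is what gives the reservoir its Markovian, suffix-based state encoding; the paper's architectural variants (e.g. multiple time scales, augmented feature spaces) modify exactly the fixed components that this sketch leaves plain.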


Subject(s)
Artificial Intelligence , Computer Simulation/standards , Markov Chains , Neural Networks, Computer , Algorithms , Humans , Mathematical Concepts , Nonlinear Dynamics , Time Factors