Reservoir optimization in recurrent neural networks using properties of Kronecker product

Logic Journal of the IGPL 18 (5):670-685 (2010)

Abstract

Recurrent neural networks based on reservoir computing are increasingly being used in many applications. Optimizing the topological structure of the reservoir and its internal connection weights for a given task is one of the most important problems in reservoir computing. In this paper, exploiting the fact that a large matrix can be constructed as the Kronecker product of several small matrices, we propose a method to optimize the reservoir. Because only a small number of parameters remain to be optimized, a gradient-based algorithm is applied to optimize these parameters, and consequently the reservoir. In addition to reducing the number of optimization parameters, the method can potentially control several other properties of the reservoir, such as spectral radius, sparsity, weight distribution, and the underlying connections, i.e. the connection topology. To demonstrate the effectiveness of the proposed optimization method, applications to the following tasks are considered: nonlinear autoregressive moving average and multiple superimposed oscillators. Simulation results show satisfactory performance of the method.
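The core construction described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: the factor sizes, random initialization, and target spectral radius are assumptions made for the example. It relies on a standard property of the Kronecker product, namely that the spectral radius of A ⊗ B is the product of the spectral radii of A and B, so the large reservoir matrix can be rescaled exactly.

```python
import numpy as np

def kronecker_reservoir(factors, spectral_radius=0.9):
    """Build a large reservoir weight matrix as the Kronecker product
    of several small factor matrices, then rescale it to a target
    spectral radius.

    Only the entries of the small factors are free parameters, which is
    the source of the dimensionality reduction: three 3x3 factors give
    a 27x27 reservoir with 27 free parameters instead of 729.
    """
    W = factors[0]
    for F in factors[1:]:
        W = np.kron(W, F)  # Kronecker product grows the matrix multiplicatively
    # Rescale so the reservoir operates at the desired spectral radius
    # (a common stability heuristic in reservoir computing).
    rho = max(abs(np.linalg.eigvals(W)))
    return W * (spectral_radius / rho)

# Example: three small 3x3 factors yield a 27x27 reservoir.
rng = np.random.default_rng(0)
factors = [rng.standard_normal((3, 3)) for _ in range(3)]
W = kronecker_reservoir(factors, spectral_radius=0.9)
```

In an optimization setting such as the one the paper describes, a gradient-based method would adjust the entries of the small factors rather than the full reservoir; sparsity and weight distribution of the large matrix likewise follow from the corresponding properties of the factors.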

Links

PhilArchive





Similar books and articles

Structure optimization of reservoir networks. Benjamin Roeschies & Christian Igel - 2010 - Logic Journal of the IGPL 18 (5):635-669.
