Improving recurrent neural network performance using transfer entropy

Oliver Obst, Joschka Boedecker, Minoru Asada

Research output: Chapter in Book / Conference Paper › Conference Paper › peer-review

20 Citations (Scopus)

Abstract

Reservoir computing approaches have been successfully applied to a variety of tasks. An inherent problem of these approaches, however, is the variation in performance caused by the fixed random initialisation of the reservoir. Self-organised approaches such as intrinsic plasticity have been applied to improve reservoir quality, but they do not take the task of the system into account. We present an approach to improve the hidden layer of recurrent neural networks, guided by the learning goal of the system. Our reservoir adaptation optimises the information transfer at each individual unit, dependent on properties of the information transfer between input and output of the system. Using synthetic data, we show that this reservoir adaptation improves the performance of both offline echo state learning and recursive least squares online learning.
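
To make the two building blocks named in the abstract concrete, here are two minimal sketches, not the authors' implementation. The first is a generic plug-in (histogram) transfer entropy estimate for discrete-valued time series; the function name `transfer_entropy` and the history-length parameter `k` are invented for this illustration, and the paper's per-unit estimator may differ.

```python
import numpy as np
from collections import Counter

def transfer_entropy(src, dst, k=1):
    """Plug-in transfer entropy estimate src -> dst, in bits, for
    discrete-valued series; destination history length k, source
    history length 1. Illustrative only."""
    src, dst = np.asarray(src), np.asarray(dst)
    joint = Counter()
    for t in range(k - 1, len(dst) - 1):
        past = tuple(dst[t - k + 1 : t + 1])      # destination history
        joint[(dst[t + 1], past, src[t])] += 1
    total = sum(joint.values())
    p_xzy = {key: c / total for key, c in joint.items()}
    # Marginals needed for the two conditional distributions.
    p_zy, p_xz, p_z = Counter(), Counter(), Counter()
    for (x_next, past, y), p in p_xzy.items():
        p_zy[(past, y)] += p
        p_xz[(x_next, past)] += p
        p_z[past] += p
    # TE = sum p(x',z,y) * log2( p(x'|z,y) / p(x'|z) )
    return sum(p * np.log2(p * p_z[past] / (p_zy[(past, y)] * p_xz[(x_next, past)]))
               for (x_next, past, y), p in p_xzy.items())

# Sanity check: a one-step delayed copy receives ~1 bit from its source.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 10000)
x = np.roll(y, 1)                  # x[t] = y[t-1]
print(transfer_entropy(y, x))      # close to 1.0
print(transfer_entropy(x, y))      # close to 0.0
```

The second sketch shows what "offline echo state learning" typically looks like: a fixed random reservoir (scaled here to spectral radius 0.9, a common heuristic aiming at the echo state property) with a ridge-regression readout trained in one batch. All sizes, the regularisation constant, and the delay task are assumptions for this example; the paper's tasks and reservoirs differ.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_res, T, washout = 1, 100, 2000, 200

# Fixed random input and reservoir weights.
W_in = rng.uniform(-0.1, 0.1, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral-radius scaling

# Synthetic task (invented for this sketch): reproduce the input delayed by 3 steps.
u = rng.uniform(-1.0, 1.0, (T, n_in))
target = np.roll(u, 3, axis=0)

# Drive the reservoir and collect its states.
x = np.zeros(n_res)
X = np.empty((T, n_res))
for t in range(T):
    x = np.tanh(W_in @ u[t] + W @ x)
    X[t] = x

# Offline readout: ridge regression on post-washout states.
ridge = 1e-6
A, B = X[washout:], target[washout:]
W_out = np.linalg.solve(A.T @ A + ridge * np.eye(n_res), A.T @ B)
print(f"delay-3 training MSE: {np.mean((A @ W_out - B) ** 2):.2e}")
```

The recursive least squares online variant mentioned in the abstract would replace the batch solve with a per-step recursive update of `W_out`; the paper's contribution, adapting the reservoir itself by optimising per-unit information transfer, sits on top of both readout methods and is not reproduced in these sketches.
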
Original language: English
Title of host publication: International Conference on Neural Information Processing, ICONIP 2010: Sydney, N.S.W., November 22-25, 2010, Proceedings. Part II
Publisher: Springer
Pages: 193-200
Number of pages: 8
ISBN (Print): 9783642175336
DOIs
Publication status: Published - 2010
Event: International Conference on Neural Information Processing
Duration: 22 Nov 2010 → 25 Nov 2010

Publication series

Name: Lecture Notes in Computer Science
ISSN (Print): 0302-9743

Conference

Conference: International Conference on Neural Information Processing
Period: 22/11/10 → 25/11/10

Keywords

  • information theory
  • machine learning
  • neural networks (computer science)
  • reservoir computing
