A heterosynaptic learning rule for neural networks

Research output: Contribution to journal › Article › Scientific › peer-review

9 Citations (Scopus)

Abstract

In this article we introduce a novel stochastic Hebb-like learning rule for neural networks that is neurobiologically motivated. This learning rule combines features of unsupervised (Hebbian) and supervised (reinforcement) learning and is stochastic with respect to the selection of the time points at which a synapse is modified. Moreover, the learning rule does not only affect the synapse between pre- and postsynaptic neuron, which is called homosynaptic plasticity, but also affects further remote synapses of the pre- and postsynaptic neurons. This more complex form of synaptic plasticity has recently come under investigation in neurobiology and is called heterosynaptic plasticity. We demonstrate that this learning rule is useful for training neural networks by learning parity functions, including the exclusive-or (XOR) mapping, in a multilayer feed-forward network. We find that our stochastic learning rule works well even in the presence of noise. Importantly, the mean learning time increases polynomially with the number of patterns to be learned, indicating efficient learning.
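The ingredients named in the abstract can be sketched in code. The following is a minimal illustrative example only, not the authors' actual rule: the network size, learning rates, update probability, and reward definition are all assumptions made for the sketch. It shows the three components the abstract describes: a Hebbian correlation term, a reinforcement (reward) signal, stochastic selection of when each synapse is modified, and a weaker heterosynaptic change reaching the other synapses of the participating neurons.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-2-1 feed-forward network for the XOR task (sizes chosen for
# illustration; the paper's exact architecture is not reproduced here).
W1 = rng.normal(0.0, 0.5, (2, 2))   # input -> hidden weights
W2 = rng.normal(0.0, 0.5, (1, 2))   # hidden -> output weights

def forward(x):
    h = np.tanh(W1 @ x)
    y = np.tanh(W2 @ h)
    return h, y

def hebb_update(W, pre, post, reward, lr=0.1, p_update=0.5):
    """Reward-modulated Hebb-like update, applied stochastically.

    Each synapse is modified only with probability p_update (the
    stochastic selection of update time points); the sign of the
    change follows the reward signal (reinforcement component),
    and its magnitude follows the pre/post correlation (Hebbian
    component).
    """
    mask = rng.random(W.shape) < p_update      # stochastic timing
    dW = lr * reward * np.outer(post, pre)     # Hebbian correlation
    return W + mask * dW

x = np.array([1.0, 0.0])
target = 1.0                                   # XOR(1, 0) = 1
h, y = forward(x)
y_val = float(y[0])

# Binary reward: +1 if the output sign matches the target, else -1
# (an assumed reinforcement signal for this sketch).
reward = 1.0 if (y_val > 0) == (target > 0.5) else -1.0

# Homosynaptic part: update the synapses between the directly
# participating pre- and postsynaptic neurons ...
W2 = hebb_update(W2, h, y, reward)
# ... heterosynaptic part: a weaker change also reaches the remote
# synapses of the participating neurons (here: the input layer).
W1 = hebb_update(W1, x, h, reward, lr=0.02)
```

Repeating such steps over all input patterns, with the heterosynaptic term spreading each modification beyond the directly active synapse, is the general flavor of the approach; the paper's results concern how the mean learning time of the actual rule scales with the number of patterns.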

Original language: English
Pages (from-to): 1501-1520
Number of pages: 20
Journal: International Journal of Modern Physics C
Volume: 17
Issue number: 10
DOIs
Publication status: Published - Oct 2006
Externally published: Yes
Publication type: A1 Journal article-refereed

Keywords

  • Biological reinforcement learning
  • Hebb-like learning
  • Heterosynaptic plasticity
  • Neural networks

ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Computer Science Applications
  • Mathematical Physics
  • General Physics and Astronomy
  • Statistical and Nonlinear Physics
