Abstract
We study the influence of the topology of a neural network on its learning dynamics. The network topology is controlled by a single parameter p_rw, which converts the topology from regular to random in a continuous way [D.J. Watts and S.H. Strogatz, Collective dynamics of small-world networks, Nature 393 (1998) 440-442]. As a test problem requiring a recurrent network, we choose a timing task: the network has to learn to connect a predefined input neuron to an output neuron in exactly T_f time steps. We analyze the learning dynamics numerically for different parameter values by counting the number of paths within the network that are available for solving the problem. Our results show that there are parameter values for which either a regular, a small-world, or a random network gives the best performance, depending strongly on the choice of the predefined input and output neurons.
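To illustrate the kind of analysis described in the abstract, the following is a minimal sketch (not the authors' code) of how one might build a Watts-Strogatz network and count the connections of exactly T_f steps between a chosen input and output node. It assumes an undirected graph, networkx's `watts_strogatz_graph` as the regular-to-random interpolation, and approximates the "number of paths" by the number of walks obtained from powers of the adjacency matrix; the paper's precise network and path definitions may differ.

```python
# Sketch: count walks of exactly t_f steps between an input and an output node
# in a Watts-Strogatz network, as a proxy for the path-counting analysis.
import networkx as nx
import numpy as np

def count_walks(n=100, k=4, p_rw=0.1, t_f=5, src=0, dst=50, seed=0):
    """Number of walks of length exactly t_f from node src to node dst."""
    g = nx.watts_strogatz_graph(n, k, p_rw, seed=seed)  # p_rw: rewiring probability
    a = nx.to_numpy_array(g)                            # adjacency matrix A
    walks = np.linalg.matrix_power(a, t_f)              # (A^t_f)[i, j] = number of walks i -> j
    return int(walks[src, dst])

# Sweep the rewiring parameter from regular (p_rw = 0) to random (p_rw = 1)
for p in (0.0, 0.1, 0.5, 1.0):
    print(p, count_walks(p_rw=p))
```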
| Original language | English |
| --- | --- |
| Pages (from-to) | 1179-1182 |
| Number of pages | 4 |
| Journal | Neurocomputing |
| Volume | 69 |
| Issue number | 10-12 |
| DOIs | |
| Publication status | Published - May 2006 |
| Externally published | Yes |
| Publication type | A1 Journal article-refereed |
Keywords
- Learning dynamics
- Neural network
- Small-world network
ASJC Scopus subject areas
- Artificial Intelligence
- Cellular and Molecular Neuroscience