%0 Journal Article
%T Re-encoding of associations by recurrent plasticity increases memory capacity
%A Daniel Medina
%A Christian Leibold
%J Frontiers in Synaptic Neuroscience
%D 2014
%I Frontiers Media
%R 10.3389/fnsyn.2014.00013
%X Recurrent networks have been proposed as a model of associative memory. In such models, memory items are stored in the strength of connections between neurons. These modifiable connections, or synapses, constitute a shared resource among all stored memories, limiting the capacity of the network. Synaptic plasticity at different time scales can play an important role in optimizing the representation of associative memories by keeping them sparse, uncorrelated, and non-redundant. Here, we use a model of sequence memory to illustrate how plasticity allows a recurrent network to self-optimize by gradually re-encoding the representation of its memory items. A learning rule is used to sparsify large patterns, i.e., patterns with many active units. As a result, pattern sizes become more homogeneous, which increases the network's dynamical stability during sequence recall and allows more patterns to be stored. Last, we show that the learning rule allows for online learning in that it keeps the network in a robust dynamical steady state while storing new memories and overwriting old ones.
%K associative memory
%K memory capacity
%K sparse coding
%K recurrent plasticity
%K memory consolidation
%U http://www.frontiersin.org/Journal/10.3389/fnsyn.2014.00013/abstract