Storing Sequences in Binary Neural Networks with High Efficiency

Xiaoran Jiang 1, 2
Lab-STICC - Laboratoire des sciences et techniques de l'information, de la communication et de la connaissance
Abstract: Sequential structure, imposed by the forward linear progression of time, is omnipresent in cognitive behavior. This thesis proposes a novel model to store sequences of any length, scalar or vectorial, in binary neural networks. In particular, the model resolves several well-known problems in sequential learning, such as error intolerance, catastrophic forgetting, and interference when storing complex sequences. The total amount of sequential information that the network is able to store grows quadratically with the number of nodes, and the efficiency, that is, the ratio between the capacity and the total amount of information consumed by the storage device, can reach around 30%.

This work can be considered an extension of the non-oriented clique-based neural networks previously proposed and studied within our team. Those networks, composed of binary neurons and binary connections, exploit graph redundancy and sparsity in order to achieve a quadratic learning diversity. To obtain the ability to store sequences, connections are given an orientation, forming a tournament-based neural network. This is also more natural biologically speaking, since communication between neurons is unidirectional, from axons to synapses. Any component of the network, a cluster or a node, can be revisited several times within a sequence or by multiple sequences. This allows the network to store sequences of any length, independent of the number of clusters and limited only by the total available resources of the network.

Moreover, in order to allow error correction and provide robustness to the network, both spatial assembly redundancy and sequential redundancy, with or without anticipation, may be combined to offer a large amount of redundancy in the activation of a node. Subsequently, a double-layered structure is introduced for the purpose of accurate retrieval.
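As a rough illustration of the capacity and efficiency figures quoted above, the following sketch computes the efficiency ratio for a hypothetical clustered binary network with oriented connections. All parameters here (the cluster count `c`, units per cluster `l`, and the number of stored messages) are invented for illustration and are not taken from the thesis; only the general accounting (information retrieved versus bits of binary connections consumed) follows the definition given in the abstract.

```python
from math import log2

# Hypothetical network: c clusters of l binary units each.
c, l = 100, 64
n = c * l  # total number of nodes

# Each stored pattern activates one unit per cluster, so it carries
# c * log2(l) bits of information.
bits_per_message = c * log2(l)

# Binary oriented connections between units of distinct clusters:
# one bit per directed pair of units.
storage_bits = c * (c - 1) * l * l

def efficiency(num_messages):
    """Ratio of retrievable information to bits held by the connections."""
    return num_messages * bits_per_message / storage_bits

print(round(efficiency(20000), 3))  # 0.296 with these illustrative numbers
```

The quadratic growth of capacity with the number of nodes follows from the same accounting: the connection resource scales as (c*l)^2 while each message costs only c*log2(l) bits.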
The lower layer, a tournament-based hetero-associative network, stores oriented sequential associations between patterns. An upper auto-associative layer of mirror nodes is superposed to emphasize the co-occurrence of elements belonging to the same pattern, in the form of a clique. This model is then extended to a hierarchical structure, which helps resolve the interference issue when storing complex sequences. This thesis also contributes by proposing and assessing new decoding rules for sparse messages, in order to fully benefit from the theoretical quadratic law of learning diversity. Besides the performance aspect, biological plausibility is a constant concern throughout this work.
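The oriented associations of the lower layer can be sketched in a toy form: store binary directed connections from every active unit of one pattern to every active unit of the next, then retrieve by a winner-take-all vote within each cluster. This is an assumption-laden illustration (the pattern encoding, scoring rule, and every name are hypothetical), not the thesis's actual algorithm, and it ignores the mirror-node upper layer and the hierarchical extension.

```python
c, l = 4, 8  # clusters and units per cluster (hypothetical)

# A pattern activates one unit per cluster; a sequence is a list of patterns.
seq = [(3, 1, 4, 1), (5, 2, 6, 5), (3, 5, 0, 7)]

# Store binary oriented connections between successive patterns:
# ((cluster, unit) of pattern t) -> ((cluster, unit) of pattern t+1).
edges = set()
for src, dst in zip(seq, seq[1:]):
    for ci, ui in enumerate(src):
        for cj, uj in enumerate(dst):
            edges.add(((ci, ui), (cj, uj)))

def next_pattern(pattern):
    """Retrieve the successor: in each cluster, keep the unit that
    receives the most contributions from the currently active units."""
    out = []
    for cj in range(c):
        scores = [sum(((ci, ui), (cj, uj)) in edges
                      for ci, ui in enumerate(pattern))
                  for uj in range(l)]
        out.append(max(range(l), key=scores.__getitem__))
    return tuple(out)

print(next_pattern(seq[0]) == seq[1])  # True for this toy example
```

In this stripped-down form, a unit or cluster may be revisited by several sequences, which is exactly what causes the interference the clique layer and hierarchical structure are introduced to resolve.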

Cited literature: 54 references
Contributor: Bibliothèque Télécom Bretagne
Submitted on : Friday, October 3, 2014 - 8:38:35 AM
Last modification on : Thursday, February 6, 2020 - 2:22:10 PM
Long-term archiving on: Sunday, January 4, 2015 - 10:30:29 AM




  • HAL Id : tel-01071038, version 1


Xiaoran Jiang. Storing Sequences in Binary Neural Networks with High Efficiency. Computer science. Télécom Bretagne, Université de Bretagne Occidentale, 2014. English. ⟨tel-01071038⟩


