Recurrent neural network weight estimation through backward tuning

Thierry Viéville 1 Xavier Hinaut 1 Thalita Drumond 1 Frédéric Alexandre 1
1 Mnemosyne - Mnemonic Synergy
LaBRI - Laboratoire Bordelais de Recherche en Informatique, Inria Bordeaux - Sud-Ouest, IMN - Institut des Maladies Neurodégénératives [Bordeaux]
Abstract : We consider an alternative formulation of weight estimation in recurrent networks, proposing a notation that covers a large class of recurrent network units and helps formulate the estimation problem. Reusing a “good old” control-theory principle, improved here with a backward-tuning numerical stabilization heuristic, we obtain a numerically stable and rather efficient second-order, distributed estimation, without any meta-parameter to adjust. The relation to existing techniques is discussed at each step. The proposed method is validated on reverse-engineering tasks.
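To make the estimation problem concrete, here is a minimal toy sketch of the kind of "reverse engineering" validation task the abstract mentions: a teacher recurrent network with known unit non-linearity generates state trajectories, and the weight matrix is recovered from the observed states. This is an illustrative assumption-laden example (tanh units, noiseless observations, plain least squares), not the authors' backward-tuning estimator; all names (`W_true`, `W_hat`, the network sizes) are hypothetical.

```python
import numpy as np

# Toy "reverse engineering" task: recover the weight matrix of a teacher
# recurrent network x[t+1] = tanh(W_true @ x[t]) from observed trajectories.
# This is a simplified stand-in, NOT the report's backward-tuning method.
rng = np.random.default_rng(0)
n = 5  # number of recurrent units (hypothetical choice)

W_true = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))  # teacher weights

# Collect (state, next-state) pairs from several short trajectories,
# restarted from random initial conditions so the data spans all of R^n.
inputs, outputs = [], []
for _ in range(20):
    x = rng.normal(size=n)
    for _ in range(10):
        x_next = np.tanh(W_true @ x)
        inputs.append(x)
        outputs.append(x_next)
        x = x_next

X = np.array(inputs)               # states at time t, shape (200, n)
# With the unit non-linearity known and invertible, atanh(x[t+1]) = W @ x[t]
# is linear in W, so each row of W is a least-squares problem.
Y = np.arctanh(np.array(outputs))  # shape (200, n)
W_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

print(np.max(np.abs(W_hat - W_true)))  # residual of the recovered weights
```

In the noiseless, invertible-non-linearity case the problem collapses to linear regression; the report's contribution addresses the general case, where no such closed form exists and numerical stability of the recurrent estimation becomes the central difficulty.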
Document type : Research report

Cited literature : 61 references

https://hal.inria.fr/hal-01610735
Contributor : Thierry Viéville
Submitted on : Thursday, October 5, 2017 - 9:11:21 AM
Last modification on : Wednesday, April 3, 2019 - 1:58:31 AM

File

RR-9100.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01610735, version 1

Citation

Thierry Viéville, Xavier Hinaut, Thalita Drumond, Frédéric Alexandre. Recurrent neural network weight estimation through backward tuning. [Research Report] RR-9100, Inria Bordeaux Sud-Ouest. 2017, pp.1-54. ⟨hal-01610735⟩
