
Recurrent neural network weight estimation through backward tuning

Thierry Viéville¹, Xavier Hinaut¹, Thalita Drumond¹, Frédéric Alexandre¹
¹ Mnemosyne - Mnemonic Synergy, LaBRI - Laboratoire Bordelais de Recherche en Informatique, Inria Bordeaux - Sud-Ouest, IMN - Institut des Maladies Neurodégénératives [Bordeaux]
Abstract: We consider an alternative formulation of weight estimation in recurrent networks, proposing a notation that covers a large class of recurrent network units and helps formulate the estimation problem. Reusing a "good old" control-theory principle, improved here with a backward-tuning numerical stabilization heuristic, we obtain a numerically stable and rather efficient second-order, distributed estimation, with no meta-parameter to adjust. The relation to existing techniques is discussed at each step. The proposed method is validated on reverse-engineering tasks.
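
To make the backward-tuning idea concrete, below is a minimal sketch of weight estimation through a backward (adjoint) pass on a reverse-engineering task: a trajectory is generated with a ground-truth weight matrix, and the matrix is re-estimated from the observed states. This is not the report's algorithm (which derives a second-order, meta-parameter-free update); the tanh unit, the quadratic criterion, and the plain gradient-descent step are all illustrative assumptions.

import numpy as np

# Minimal sketch, NOT the report's algorithm: the report derives a
# second-order, meta-parameter-free update, while this toy uses plain
# gradient descent with a hand-picked learning rate. The tanh unit and
# the quadratic criterion are illustrative assumptions.

rng = np.random.default_rng(0)
n, T = 5, 50                       # state size, trajectory length

# Reverse-engineering setup: generate x[t+1] = tanh(W_true @ x[t])
# and try to recover W_true from the observed states alone.
W_true = rng.normal(scale=0.5, size=(n, n))
x = np.zeros((T + 1, n))
x[0] = rng.normal(size=n)
for t in range(T):
    x[t + 1] = np.tanh(W_true @ x[t])

W = 0.1 * rng.normal(size=(n, n))  # initial estimate
for step in range(2000):
    # Forward pass with the current estimate.
    xh = np.zeros((T + 1, n))
    xh[0] = x[0]
    for t in range(T):
        xh[t + 1] = np.tanh(W @ xh[t])

    # Backward (adjoint) pass: lam[t] carries the sensitivity of the
    # criterion C = sum_t ||xh[t] - x[t]||^2 propagated from later times.
    lam = np.zeros((T + 1, n))
    grad = np.zeros_like(W)
    for t in reversed(range(T)):
        g = 2 * (xh[t + 1] - x[t + 1]) + lam[t + 1]  # total dC/dxh[t+1]
        d = g * (1 - xh[t + 1] ** 2)                 # through tanh'
        grad += np.outer(d, xh[t])                   # dC/dW contribution
        lam[t] = W.T @ d                             # propagate backward

    W -= 0.01 * grad               # first-order update, for brevity

print("criterion:", float(np.sum((xh - x) ** 2)))
print("weight error:", float(np.linalg.norm(W - W_true)))

The backward recursion on lam is exactly the adjoint-state computation from optimal control that the abstract alludes to; the report replaces the fixed learning rate with a second-order, stabilized update.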
Document type: Reports

Cited literature: 61 references

https://hal.inria.fr/hal-01610735
Contributor: Thierry Viéville
Submitted on: Thursday, October 5, 2017 - 9:11:21 AM
Last modification on: Friday, January 21, 2022 - 3:10:03 AM

File

RR-9100.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-01610735, version 1

Citation

Thierry Viéville, Xavier Hinaut, Thalita Drumond, Frédéric Alexandre. Recurrent neural network weight estimation through backward tuning. [Research Report] RR-9100, Inria Bordeaux Sud-Ouest. 2017, pp. 1-54. ⟨hal-01610735⟩

Metrics

Record views: 277
File downloads: 384