
Which Hype for my New Task? Hints and Random Search for Reservoir Computing Hyperparameters

Xavier Hinaut ¹, Nathan Trouvain ¹
¹ Mnemosyne - Mnemonic Synergy, LaBRI - Laboratoire Bordelais de Recherche en Informatique, Inria Bordeaux - Sud-Ouest, IMN - Institut des Maladies Neurodégénératives [Bordeaux]
Abstract: In learning systems, hyperparameters are parameters that are not learned but must be set a priori. In Reservoir Computing, several such parameters need to be set a priori depending on the task. Newcomers to Reservoir Computing often lack intuition about which hyperparameters to tune and how to tune them. For instance, beginners often explore the reservoir sparsity, but in practice this parameter has little influence on performance. Most importantly, many authors keep performing suboptimal hyperparameter searches: using grid search to explore more than two hyperparameters, while restraining the spectral radius to be below unity. In this short paper, we give suggestions and intuitions, and present a general method to find robust hyperparameters while understanding their influence on performance. We also provide a graphical interface (included in ReservoirPy) to make this hyperparameter search more intuitive. Finally, we discuss some potential refinements of the proposed method.
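The random-search approach advocated in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's actual tooling (which is built on ReservoirPy): the search ranges, the `sample_hyperparameters` helper, and the stand-in objective `dummy_evaluate` below are all hypothetical. The point it demonstrates is sampling hyperparameters independently at random, on a log scale where appropriate, and without restraining the spectral radius to below unity:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical search space (illustration only): log-uniform for the
# spectral radius and input scaling, uniform for the leak rate.
def sample_hyperparameters(rng):
    return {
        # The spectral radius is deliberately allowed to exceed 1.0,
        # instead of being constrained below unity as in grid searches
        # the paper criticizes.
        "spectral_radius": 10 ** rng.uniform(-2, 1),   # 0.01 .. 10
        "input_scaling":   10 ** rng.uniform(-3, 1),   # 0.001 .. 10
        "leak_rate":       rng.uniform(0.0, 1.0),
    }

def random_search(evaluate, n_trials, rng):
    """Keep the best (loss, params) pair over n_trials random samples."""
    best_loss, best_params = float("inf"), None
    for _ in range(n_trials):
        params = sample_hyperparameters(rng)
        loss = evaluate(params)
        if loss < best_loss:
            best_loss, best_params = loss, params
    return best_loss, best_params

# Stand-in objective (hypothetical): in a real run this would train a
# reservoir on the task and return a validation error.
def dummy_evaluate(p):
    return (p["spectral_radius"] - 1.2) ** 2 + (p["leak_rate"] - 0.3) ** 2

loss, params = random_search(dummy_evaluate, n_trials=200, rng=rng)
print(f"best loss {loss:.4f} at spectral radius {params['spectral_radius']:.3f}")
```

Unlike grid search, each trial here probes a fresh value of every hyperparameter, so the number of distinct values explored per parameter grows with the number of trials rather than shrinking as more parameters are added.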
https://hal.inria.fr/hal-03203318
Contributor: Xavier Hinaut
Submitted on: Tuesday, April 20, 2021 - 5:08:56 PM
Last modification on: Thursday, May 6, 2021 - 4:06:46 PM

File: Hinaut2021_ICANN_Reservoir-Ran... (files produced by the author(s))

Identifiers

  • HAL Id: hal-03203318, version 1

Citation

Xavier Hinaut, Nathan Trouvain. Which Hype for my New Task? Hints and Random Search for Reservoir Computing Hyperparameters. 2021. ⟨hal-03203318⟩
