Which Hype for my New Task? Hints and Random Search for Reservoir Computing Hyperparameters
Conference Papers Year: 2021

Xavier Hinaut
Nathan Trouvain

Abstract

In learning systems, hyperparameters are parameters that are not learned but need to be set a priori. In Reservoir Computing, several such parameters must be set a priori depending on the task. Newcomers to Reservoir Computing often lack intuition about which hyperparameters to tune and how to tune them. For instance, beginners often explore the reservoir sparsity, but in practice this parameter has little influence on performance for ESNs. Most importantly, many authors keep performing suboptimal hyperparameter searches: using grid search to explore more than two hyperparameters, while restricting the spectral radius to be below unity. In this short paper, we give suggestions and intuitions, and present a general method to find robust hyperparameters while understanding their influence on performance. We also provide a graphical interface (included in ReservoirPy) to make this hyperparameter search more intuitive. Finally, we discuss some potential refinements of the proposed method.
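The spirit of the search advocated in the abstract (random sampling rather than grid search, without forcing the spectral radius below unity) can be illustrated with a short sketch. This is a minimal, hedged example and not the paper's experimental protocol: the toy task, sampling ranges, trial budget, and network size are placeholder assumptions, and the node names and parameters (Reservoir, Ridge, sr, lr, input_scaling, ridge, seed) follow ReservoirPy's public API as documented around version 0.3.

```python
# Minimal sketch of a random hyperparameter search for an ESN.
# Assumes ReservoirPy's node API (Reservoir, Ridge, ">>" composition);
# adapt names and ranges to your own task.
import numpy as np
from reservoirpy.nodes import Reservoir, Ridge

rng = np.random.default_rng(42)

# Toy one-step-ahead prediction task on a sine wave (placeholder data).
t = np.linspace(0, 100, 5000)
x = np.sin(t).reshape(-1, 1)
X_train, y_train = x[:3999], x[1:4000]
X_test, y_test = x[4000:-1], x[4001:]

def sample_hyperparameters():
    # Log-uniform sampling is usually preferred for these parameters.
    return {
        "sr": 10 ** rng.uniform(-2, 1),         # spectral radius, not forced below 1
        "lr": 10 ** rng.uniform(-3, 0),         # leak rate
        "input_scaling": 10 ** rng.uniform(-2, 1),
        "ridge": 10 ** rng.uniform(-8, -1),     # readout regularization
    }

best_loss, best_hp = np.inf, None
for trial in range(50):                          # random-search budget
    hp = sample_hyperparameters()
    reservoir = Reservoir(units=300, sr=hp["sr"], lr=hp["lr"],
                          input_scaling=hp["input_scaling"], seed=trial)
    readout = Ridge(ridge=hp["ridge"])
    esn = reservoir >> readout
    esn.fit(X_train, y_train)
    pred = esn.run(X_test)
    nrmse = np.sqrt(np.mean((pred - y_test) ** 2)) / y_test.std()
    if nrmse < best_loss:
        best_loss, best_hp = nrmse, hp

print("best NRMSE:", best_loss, "with hyperparameters:", best_hp)
```

Keeping only the best trial is the crudest use of such a search; the paper argues for also inspecting how performance varies with each hyperparameter across all trials, which is what the graphical interface mentioned in the abstract is meant to support.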
Main file: Hinaut2021_ICANN_Reservoir-Random-Search_HAL-preprint-v2.pdf (1.91 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03203318, version 1 (20-04-2021)
hal-03203318, version 2 (15-12-2021)

Identifiers

  • HAL Id: hal-03203318, version 2

Cite

Xavier Hinaut, Nathan Trouvain. Which Hype for my New Task? Hints and Random Search for Reservoir Computing Hyperparameters. ICANN 2021 - 30th International Conference on Artificial Neural Networks, Sep 2021, Bratislava, Slovakia. ⟨hal-03203318v2⟩

Collections

CNRS INRIA INRIA2