Recurrent Kernel Networks

Dexiong Chen 1 Laurent Jacob 2 Julien Mairal 1
1 Thoth - Learning models from massive data
Inria Grenoble - Rhône-Alpes, LJK - Laboratoire Jean Kuntzmann
2 High-dimensional statistics for genomics
Département PEGASE (LBBE)
Abstract: Substring kernels are classical tools for representing biological sequences or text. However, when large amounts of annotated data are available, models that allow end-to-end training, such as neural networks, are often preferred. Links between recurrent neural networks (RNNs) and substring kernels have recently been drawn, by formally showing that RNNs with specific activation functions are points in a reproducing kernel Hilbert space (RKHS). In this paper, we revisit this link by generalizing convolutional kernel networks—originally related to a relaxation of the mismatch kernel—to model gaps in sequences. This results in a new type of recurrent neural network that can be trained end-to-end with backpropagation, or without supervision by using kernel approximation techniques. We experimentally show that our approach is well suited to biological sequences, where it outperforms existing methods on protein classification tasks.
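For context, the substring kernels with gaps that the abstract builds on can be sketched with the classical gap-weighted subsequence kernel recursion of Lodhi et al. (2002). The implementation below is a minimal illustrative version of that kernel, not the paper's relaxed, differentiable RKN formulation; the function name and the decay parameter `lam` are chosen here for illustration.

```python
from functools import lru_cache

def gap_weighted_kernel(s, t, n, lam=0.5):
    """Gap-weighted subsequence kernel (Lodhi et al., 2002).

    Counts subsequences of length n common to s and t, down-weighting
    each matched occurrence by lam raised to the length of the span it
    covers, so gappy matches contribute less than contiguous ones.
    """

    @lru_cache(maxsize=None)
    def kprime(i, p, q):
        # Auxiliary term K'_i over prefixes s[:p], t[:q].
        if i == 0:
            return 1.0
        if p < i or q < i:
            return 0.0
        x = s[p - 1]
        total = lam * kprime(i, p - 1, q)
        for j in range(1, q + 1):
            if t[j - 1] == x:
                total += kprime(i - 1, p - 1, j - 1) * lam ** (q - j + 2)
        return total

    @lru_cache(maxsize=None)
    def k(p, q):
        # Main term K_n over prefixes s[:p], t[:q].
        if p < n or q < n:
            return 0.0
        x = s[p - 1]
        total = k(p - 1, q)
        for j in range(1, q + 1):
            if t[j - 1] == x:
                total += kprime(n - 1, p - 1, j - 1) * lam ** 2
        return total

    return k(len(s), len(t))
```

As a sanity check, for s = t = "cat" and n = 2 the common length-2 subsequences are "ca" and "at" (contiguous, span 2) and "ct" (span 3), giving 2·λ⁴ + λ⁶; the paper's contribution is to relax this hard subsequence matching into a differentiable form that yields an RNN trainable by backpropagation.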


https://hal.inria.fr/hal-02151135
Contributor: Dexiong Chen
Submitted on: Thursday, October 17, 2019 - 3:47:23 PM
Last modification on: Friday, October 18, 2019 - 1:20:22 PM

File

neurips_2019.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-02151135, version 2
  • arXiv: 1906.03200

Citation

Dexiong Chen, Laurent Jacob, Julien Mairal. Recurrent Kernel Networks. NeurIPS 2019 - Thirty-third Conference on Neural Information Processing Systems, Dec 2019, Vancouver, Canada. ⟨hal-02151135v2⟩
