Recurrent Kernel Networks

Dexiong Chen 1, Laurent Jacob 2, Julien Mairal 1
1 Thoth - Learning models from massive data
LJK - Laboratoire Jean Kuntzmann, Inria Grenoble - Rhône-Alpes
2 High-dimensional statistics for genomics
PEGASE - PEGASE Department [LBBE]
Abstract: Substring kernels are classical tools for representing biological sequences or text. However, when large amounts of annotated data are available, models that allow end-to-end training, such as neural networks, are often preferred. Links between recurrent neural networks (RNNs) and substring kernels have recently been drawn by formally showing that RNNs with specific activation functions are points in a reproducing kernel Hilbert space (RKHS). In this paper, we revisit this link by generalizing convolutional kernel networks (originally related to a relaxation of the mismatch kernel) to model gaps in sequences. This results in a new type of recurrent neural network that can be trained end-to-end with backpropagation, or without supervision by using kernel approximation techniques. We experimentally show that our approach is well suited to biological sequences, where it outperforms existing methods for protein classification tasks.
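
To make the kernel side of this link concrete, the sketch below implements the classical gap-weighted substring kernel of Lodhi et al. (2002), one representative of the family of substring kernels with gaps that the abstract alludes to. This is an illustrative baseline, not the authors' recurrent kernel network; the function name and the gap-decay parameter `lam` are choices made here for the example.

```python
# A minimal sketch (not the paper's code) of the gap-weighted
# substring kernel, using the standard dynamic-programming recursion
# of Lodhi et al. (2002). `k` is the substring length and `lam`
# (a name chosen here) is a decay in (0, 1) that penalizes gaps.

def gap_weighted_kernel(s: str, t: str, k: int, lam: float = 0.5) -> float:
    """K_k(s, t): sum over all common subsequences of length k,
    each weighted by lam ** (total span it occupies in s and t)."""
    n, m = len(s), len(t)
    # Kp[i][a][b] = K'_i(s[:a], t[:b]), the auxiliary table of the
    # recursion; K'_0 is identically 1.
    Kp = [[[0.0] * (m + 1) for _ in range(n + 1)] for _ in range(k)]
    for a in range(n + 1):
        for b in range(m + 1):
            Kp[0][a][b] = 1.0
    for i in range(1, k):
        for a in range(1, n + 1):
            for b in range(1, m + 1):
                if min(a, b) < i:
                    continue  # prefixes too short to hold i characters
                val = lam * Kp[i][a - 1][b]
                for j in range(1, b + 1):
                    if t[j - 1] == s[a - 1]:
                        # match the last char of s[:a] at position j of t;
                        # lam**(b - j + 2) charges the trailing gap in t.
                        val += Kp[i - 1][a - 1][j - 1] * lam ** (b - j + 2)
                Kp[i][a][b] = val
    # Close the recursion: each match of a k-th character contributes
    # lam**2 times the weight of length-(k-1) subsequences before it.
    result = 0.0
    for a in range(1, n + 1):
        for j in range(1, m + 1):
            if t[j - 1] == s[a - 1]:
                result += Kp[k - 1][a - 1][j - 1] * lam ** 2
    return result

# Example: contiguous matches score higher than gapped ones.
print(gap_weighted_kernel("cat", "cat", 2))   # all pairs contiguous
print(gap_weighted_kernel("cat", "cart", 2))  # "at" now spans a gap in t
```

The structural point exploited by the paper is that recursions of this kind are computed by scanning one sequence while updating a state, which is exactly the shape of an RNN computation; relaxing the kernel then yields a network trainable by backpropagation.
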
Document type: Preprints, Working Papers, ...

Cited literature: 20 references

https://hal.inria.fr/hal-02151135
Contributor: Dexiong Chen
Submitted on: Friday, June 7, 2019 - 5:12:03 PM
Last modification on: Monday, June 17, 2019 - 3:30:57 PM

File: main.pdf (produced by the author(s))

Identifiers

  • HAL Id: hal-02151135, version 1

Citation

Dexiong Chen, Laurent Jacob, Julien Mairal. Recurrent Kernel Networks. 2019. ⟨hal-02151135⟩

Metrics

Record views: 191
File downloads: 341