PLLay: Efficient Topological Layer based on Persistence Landscapes

Conference paper (NeurIPS 2020)

Kwangho Kim, Jisu Kim, Manzil Zaheer, Joon Sik Kim, Frédéric Chazal

Abstract

We propose PLLay, a novel topological layer for general deep learning models based on persistence landscapes, which efficiently exploits the underlying topological features of the input data. We show differentiability with respect to the layer's inputs for general persistent homology with arbitrary filtration. The proposed layer can therefore be placed anywhere in a network and feed critical information about the topological features of the input data into subsequent layers, improving the network's learnability on a given task. A task-optimal structure for PLLay is learned during training via backpropagation, without requiring any input featurization or data preprocessing. We provide a novel adaptation for the DTM function-based filtration and show, through a stability analysis, that the proposed layer is robust to noise and outliers. We demonstrate the effectiveness of our approach with classification experiments on various datasets.
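As a concrete illustration of the object the layer is built on (not the authors' implementation): the k-th persistence landscape of a persistence diagram is the pointwise k-th largest value of the "tent" functions Λ_i(t) = max(0, min(t − b_i, d_i − t)) associated with the diagram's (birth, death) pairs. A minimal NumPy sketch, with hypothetical function and parameter names:

```python
import numpy as np

def persistence_landscape(diagram, k_max, grid):
    """Evaluate the first k_max persistence-landscape functions on a grid.

    diagram: iterable of (birth, death) pairs from a persistence diagram.
    k_max:   number of landscape functions to return.
    grid:    1-D array of evaluation points t.
    Returns an array of shape (k_max, len(grid)).
    """
    diagram = np.atleast_2d(np.asarray(diagram, dtype=float))
    births = diagram[:, 0][:, None]          # shape (n_points, 1)
    deaths = diagram[:, 1][:, None]
    t = np.asarray(grid, dtype=float)[None, :]  # shape (1, n_grid)

    # Tent function for each (birth, death) pair, evaluated on the grid:
    # rises from birth, peaks at the midpoint, falls back to zero at death.
    tents = np.maximum(0.0, np.minimum(t - births, deaths - t))

    # k-th landscape = k-th largest tent value at each grid point;
    # sort each column in descending order.
    tents_desc = -np.sort(-tents, axis=0)

    # Pad with zeros if the diagram has fewer than k_max points.
    out = np.zeros((k_max, t.shape[1]))
    n = min(k_max, tents_desc.shape[0])
    out[:n] = tents_desc[:n]
    return out
```

For a diagram with a single feature born at 0 and dying at 2, the first landscape is the tent peaking at height 1 over t = 1, and all higher landscapes are zero. PLLay then builds on such landscape values with a learned weighted-averaging scheme, which this sketch does not cover.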
Main file: neurips_2020.pdf (1.08 MB). Origin: files produced by the author(s).

Dates and versions

hal-03111277 , version 1 (15-01-2021)

Cite

Kwangho Kim, Jisu Kim, Manzil Zaheer, Joon Sik Kim, Frédéric Chazal, et al. PLLay: Efficient Topological Layer based on Persistence Landscapes. NeurIPS 2020 - 34th Conference on Neural Information Processing Systems, Dec 2020, Vancouver / Virtual, Canada. ⟨hal-03111277⟩