Conference papers

PLLay: Efficient Topological Layer based on Persistence Landscapes

Abstract: We propose PLLay, a novel topological layer for general deep learning models based on persistence landscapes, in which we can efficiently exploit the underlying topological features of the input data structure. In this work, we show differentiability with respect to layer inputs, for a general persistent homology with arbitrary filtration. Thus, our proposed layer can be placed anywhere in the network and feed critical information on the topological features of input data into subsequent layers to improve the learnability of the networks toward a given task. A task-optimal structure of PLLay is learned during training via backpropagation, without requiring any input featurization or data preprocessing. We provide a novel adaptation for the DTM function-based filtration, and show that the proposed layer is robust against noise and outliers through a stability analysis. We demonstrate the effectiveness of our approach by classification experiments on various datasets.
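To illustrate the kind of layer the abstract describes, below is a minimal, self-contained PyTorch sketch of a differentiable persistence-landscape layer. It is not the authors' implementation: it assumes the persistence diagram is already given as (birth, death) pairs (PLLay additionally handles differentiation through the filtration itself, such as the DTM-based one), and the class name, grid resolution, number of landscape levels, and weighting scheme are illustrative choices only.

```python
# Hypothetical sketch of a persistence-landscape layer (not the PLLay code).
import torch
import torch.nn as nn


class PersistenceLandscapeLayer(nn.Module):
    """Evaluates the first k persistence landscapes on a fixed grid and
    returns a learnable weighted summary, so gradients flow to the weights
    (and to the diagram, if the diagram is itself differentiable)."""

    def __init__(self, t_min=0.0, t_max=1.0, resolution=50, k_max=3):
        super().__init__()
        # Fixed evaluation grid t_1 < ... < t_R.
        self.register_buffer("grid", torch.linspace(t_min, t_max, resolution))
        self.k_max = k_max
        # Trainable weights over (landscape level, grid point); a simple
        # stand-in for the task-optimal weighting learned during training.
        self.weight = nn.Parameter(
            torch.full((k_max, resolution), 1.0 / (k_max * resolution))
        )

    def forward(self, diagram):
        # diagram: (n, 2) tensor of (birth, death) pairs for one homology dimension.
        birth, death = diagram[:, 0:1], diagram[:, 1:2]       # (n, 1)
        t = self.grid.unsqueeze(0)                            # (1, R)
        # Tent function of each point: max(0, min(t - birth, death - t)).
        tent = torch.clamp(torch.minimum(t - birth, death - t), min=0.0)  # (n, R)
        # k-th landscape value at t = k-th largest tent value at t.
        k = min(self.k_max, tent.shape[0])
        top = torch.topk(tent, k, dim=0).values               # (k, R)
        if k < self.k_max:                                     # pad if the diagram is small
            pad = torch.zeros(self.k_max - k, tent.shape[1],
                              dtype=tent.dtype, device=tent.device)
            top = torch.cat([top, pad], dim=0)
        # Weighted sum over the grid -> one feature per landscape level.
        return (self.weight * top).sum(dim=1)                 # (k_max,)


# Toy usage: a diagram with two features; gradients reach the layer weights.
layer = PersistenceLandscapeLayer()
diag = torch.tensor([[0.1, 0.6], [0.2, 0.9]])
features = layer(diag)          # shape (3,)
features.sum().backward()
```

Because the output is an ordinary tensor, such a layer can feed into any subsequent fully connected or convolutional layers, which is the sense in which the abstract says the layer "can be placed anywhere in the network".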

https://hal.inria.fr/hal-03111277
Contributor: Jisu Kim
Submitted on: Friday, January 15, 2021 - 11:22:12 AM
Last modification on: Saturday, May 1, 2021 - 3:41:35 AM
Long-term archiving on: Friday, April 16, 2021 - 6:23:58 PM

File

neurips_2020.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-03111277, version 1
  • arXiv: 2002.02778

Citation

Kwangho Kim, Jisu Kim, Manzil Zaheer, Joon Kim, Frédéric Chazal, et al.. PLLay: Efficient Topological Layer based on Persistence Landscapes. NeurIPS 2020 - 34th Conference on Neural Information Processing Systems, Dec 2020, Vancouver / Virtual, Canada. ⟨hal-03111277⟩
