A Convex Surrogate Operator for General Non-Modular Loss Functions

Abstract: In this work, a novel generic convex surrogate for general non-modular loss functions is introduced, which provides for the first time a tractable solution for loss functions that are neither supermodular nor submodular. This convex surrogate is based on a submodular-supermodular decomposition. It is the sum of two convex surrogates that separately bound the supermodular component and the submodular component using slack-rescaling and the Lovász hinge, respectively. The resulting surrogate is convex, piecewise linear, an extension of the loss function, and admits polynomial-time subgradient computation. Empirical results are reported on a non-submodular loss based on the Sørensen-Dice difference function, demonstrating the improved performance, efficiency, and scalability of the novel convex surrogate.
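The abstract names two concrete ingredients: the Sørensen-Dice difference as the non-submodular loss, and the Lovász hinge, which is built on the Lovász extension of a submodular set function. A minimal sketch of both is given below; the function names and the example set function are illustrative assumptions, not code from the paper.

```python
def dice_difference(y_true, y_pred):
    """Sorensen-Dice difference between two sets: 1 - 2|A∩B| / (|A|+|B|).

    This is the non-submodular loss the abstract refers to (illustrative
    implementation, not the authors' code).
    """
    a, b = set(y_true), set(y_pred)
    if not a and not b:
        return 0.0
    return 1.0 - 2.0 * len(a & b) / (len(a) + len(b))


def lovasz_extension(f, x):
    """Evaluate the Lovász extension of a set function f at a real vector x.

    f maps a frozenset of indices to a real number, with f(frozenset()) == 0.
    The extension is computed by the standard sorting construction: sort
    coordinates of x in decreasing order and accumulate marginal gains.
    When f is submodular, this extension is convex -- the property the
    Lovász hinge exploits.
    """
    n = len(x)
    order = sorted(range(n), key=lambda i: -x[i])
    value = 0.0
    prefix = set()
    prev = f(frozenset())  # == 0 by convention
    for i in order:
        prefix.add(i)
        cur = f(frozenset(prefix))
        value += x[i] * (cur - prev)
        prev = cur
    return value
```

On any 0-1 indicator vector the Lovász extension agrees with the set function on the corresponding support, which is the sense in which the surrogate "extends" the loss.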
Document type:
Conference paper
The 25th Belgian-Dutch Conference on Machine Learning, Sep 2016, Kortrijk, Belgium

https://hal.inria.fr/hal-01417108
Contributor: Jiaqian Yu <>
Submitted on: Thursday, December 15, 2016 - 12:06:02
Last modified on: Friday, April 6, 2018 - 13:32:01
Long-term archiving on: Thursday, March 16, 2017 - 17:30:30

File

benelearn2016.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01417108, version 1

Citation

Jiaqian Yu, Matthew Blaschko. A Convex Surrogate Operator for General Non-Modular Loss Functions. The 25th Belgian-Dutch Conference on Machine Learning, Sep 2016, Kortrijk, Belgium. 〈hal-01417108〉
