The Lovász Hinge: A Novel Convex Surrogate for Submodular Losses

Abstract: Learning with non-modular losses is an important problem when sets of predictions are made simultaneously. The main tools for constructing convex surrogate loss functions for set prediction are margin rescaling and slack rescaling. In this work, we show that these strategies lead to tight convex surrogates if and only if the underlying loss function is increasing in the number of incorrect predictions. However, gradient or cutting-plane computation for these functions is NP-hard for non-supermodular loss functions. We propose instead a novel surrogate loss function for submodular losses, the Lovász hinge, which leads to O(p log p) complexity with O(p) oracle accesses to the loss function to compute a gradient or cutting-plane. We prove that the Lovász hinge is convex and yields an extension. As a result, we have developed the first tractable convex surrogates in the literature for submodular losses. We demonstrate the utility of this novel convex surrogate through several set prediction tasks, including on the PASCAL VOC and Microsoft COCO datasets.
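The O(p log p) complexity quoted in the abstract comes from evaluating the Lovász extension of the set-valued loss: sort the p margin violations in decreasing order, then accumulate the marginal gains of the loss function along that order, which requires one sort plus O(p) oracle calls. A minimal sketch of the Lovász extension computation (the function name `lovasz_extension` and the `frozenset`-based oracle interface are illustrative choices, not the authors' implementation):

```python
import numpy as np

def lovasz_extension(F, s):
    """Evaluate the Lovász extension of a set function F at a
    nonnegative vector s of length p.

    F : callable mapping a frozenset of indices in {0, ..., p-1}
        to a real value, with F(frozenset()) == 0 assumed.
    s : sequence of nonnegative reals (e.g. hinge-style margin
        violations, one per element of the prediction set).

    Cost: one O(p log p) sort plus O(p) oracle accesses to F.
    """
    s = np.asarray(s, dtype=float)
    # Sort indices by decreasing value of s.
    order = np.argsort(-s)
    value = 0.0
    prev = 0.0
    chosen = []
    for i in order:
        chosen.append(int(i))
        # Marginal gain of adding element i to the growing chain of sets.
        cur = F(frozenset(chosen))
        value += s[i] * (cur - prev)
        prev = cur
    return value
```

For a modular loss such as the Hamming loss F(S) = |S|, every marginal gain is 1 and the extension reduces to the sum of the violations, recovering the ordinary (additive) hinge; the interesting behavior appears for strictly submodular F, where later mistakes are discounted.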
Document type: Preprint, working paper
2017

https://hal.inria.fr/hal-01241626
Contributor: Jiaqian Yu
Submitted on: Monday, May 15, 2017 - 10:27:07
Last modified on: Tuesday, April 17, 2018 - 09:05:14
Document(s) archived on: Thursday, August 17, 2017 - 00:31:24

Files

arXiv2017v2.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01241626, version 2
  • ARXIV : 1512.07797

Citation

Jiaqian Yu, Matthew Blaschko. The Lovász Hinge: A Novel Convex Surrogate for Submodular Losses. 2017. 〈hal-01241626v2〉

Metrics

Record views: 227
File downloads: 136