Learning Submodular Losses with the Lovász Hinge

Abstract: Learning with non-modular losses is an important problem when sets of predictions are made simultaneously. The main tools for constructing convex surrogate loss functions for set prediction are margin rescaling and slack rescaling. In this work, we show that these strategies lead to tight convex surrogates if and only if the underlying loss function is increasing in the number of incorrect predictions. However, gradient or cutting-plane computation for these functions is NP-hard for non-supermodular loss functions. We propose instead a novel convex surrogate loss function for submodular losses, the Lovász hinge, which leads to O(p log p) complexity with O(p) oracle accesses to the loss function to compute a gradient or cutting-plane. As a result, we have developed the first tractable convex surrogates in the literature for submodular losses. We demonstrate the utility of this novel convex surrogate through a real-world image labeling task.
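The complexity claim in the abstract rests on a standard property of submodular set functions: their Lovász extension can be evaluated by sorting the p input scores (O(p log p)) and querying the loss oracle once per element (O(p) accesses). The sketch below is an illustrative implementation of the Lovász extension under these assumptions; it is not the paper's exact surrogate definition, and the names `lovasz_extension` and `loss` are placeholders for this example.

```python
def lovasz_extension(loss, s):
    """Evaluate the Lovász extension of a set function at scores s.

    loss: oracle mapping a frozenset of indices to a real value,
          with loss(frozenset()) == 0; assumed submodular for the
          extension to be convex.
    s:    list of p real-valued per-element scores (e.g. hinge slacks).
    """
    p = len(s)
    # Sort indices by decreasing score: O(p log p).
    order = sorted(range(p), key=lambda i: s[i], reverse=True)
    ext = 0.0
    prev = 0.0
    prefix = set()
    for i in order:
        prefix.add(i)
        cur = loss(frozenset(prefix))  # one oracle access per element: O(p) total
        # Weight each score by the discrete derivative of the loss.
        ext += (cur - prev) * s[i]
        prev = cur
    return ext
```

For a modular loss such as the Hamming loss, `loss(A) = len(A)`, every discrete derivative is 1 and the extension reduces to the sum of the scores, matching the usual decomposable hinge; submodular losses down-weight scores added later in the sorted order.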
Document type: Conference papers

Cited literature: 23 references

https://hal.inria.fr/hal-01151823
Contributor: Matthew Blaschko
Submitted on: Wednesday, May 13, 2015 - 4:16:47 PM
Last modification on: Thursday, February 7, 2019 - 5:29:18 PM
Document(s) archived on: Wednesday, April 19, 2017 - 11:40:18 PM

File

YU_ICML2015.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-01151823, version 1

Citation

Jiaqian Yu, Matthew Blaschko. Learning Submodular Losses with the Lovász Hinge. International Conference on Machine Learning, Jul 2015, Lille, France. ⟨hal-01151823⟩

Metrics

Record views: 267
File downloads: 856