Learning Submodular Losses with the Lovász Hinge

Abstract: Learning with non-modular losses is an important problem when sets of predictions are made simultaneously. The main tools for constructing convex surrogate loss functions for set prediction are margin rescaling and slack rescaling. In this work, we show that these strategies lead to tight convex surrogates iff the underlying loss function is increasing in the number of incorrect predictions. However, gradient or cutting-plane computation for these functions is NP-hard for non-supermodular loss functions. We propose instead a novel convex surrogate loss function for submodular losses, the Lovász hinge, which leads to O(p log p) complexity with O(p) oracle accesses to the loss function to compute a gradient or cutting-plane. As a result, we have developed the first tractable convex surrogates in the literature for submodular losses. We demonstrate the utility of this novel convex surrogate through a real world image labeling task.
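The stated O(p log p) complexity with O(p) oracle accesses comes from evaluating the Lovász extension of the set loss: sort the p per-element scores once, then query the loss oracle on a nested chain of sets. A minimal sketch of that computation (not the authors' code; `loss_fn` and the score vector `s` are illustrative placeholders):

```python
def lovasz_extension(loss_fn, s):
    """Evaluate the Lovász extension of a set function at scores s.

    loss_fn: oracle mapping a frozenset of indices to a real loss,
             with loss_fn(frozenset()) == 0 assumed.
    s:       per-element scores (e.g. margin violations), length p.

    Cost: one O(p log p) sort plus O(p) oracle calls, matching the
    complexity claimed in the abstract.
    """
    p = len(s)
    # Sort indices by decreasing score.
    order = sorted(range(p), key=lambda i: -s[i])
    g = [0.0] * p          # (sub)gradient coordinates
    prev = 0.0
    members = set()
    for idx in order:
        # Marginal gain of adding idx to the growing chain of sets.
        members.add(idx)
        cur = loss_fn(frozenset(members))
        g[idx] = cur - prev
        prev = cur
    value = sum(gi * si for gi, si in zip(g, s))
    return value, g
```

For a modular loss such as `len(A)` the extension reduces to the plain sum of scores, while for a genuinely submodular loss (e.g. `sqrt(len(A))`) later additions receive smaller gradient weight; the Lovász hinge surrogate in the paper is built on this extension applied to hinge-style margin scores.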
Document type: Conference papers
Contributor: Matthew Blaschko
Submitted on: Wednesday, May 13, 2015 - 4:16:47 PM
Last modification on: Thursday, February 3, 2022 - 3:01:39 AM
Long-term archiving on: Wednesday, April 19, 2017 - 11:40:18 PM
  • HAL Id: hal-01151823, version 1


Jiaqian Yu, Matthew Blaschko. Learning Submodular Losses with the Lovász Hinge. International Conference on Machine Learning, Jul 2015, Lille, France. ⟨hal-01151823⟩