A Convex Surrogate Operator for General Non-Modular Loss Functions

Abstract: In this work, a novel generic convex surrogate for general non-modular loss functions is introduced, which provides for the first time a tractable solution for loss functions that are neither supermodular nor submodular. This convex surrogate is based on a submodular-supermodular decomposition: it is the sum of two convex surrogates that separately bound the supermodular component and the submodular component, using slack-rescaling and the Lovász hinge, respectively. The resulting surrogate is convex, piecewise linear, an extension of the loss function, and its subgradient can be computed in polynomial time. Empirical results are reported on a non-submodular loss based on the Sørensen-Dice difference function, demonstrating the improved performance, efficiency, and scalability of the novel convex surrogate.
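As a rough illustration of the building blocks named in the abstract, the sketch below evaluates a Lovász-hinge-style surrogate of a set loss at a margin vector, using the Sørensen-Dice difference as the example loss. This is a minimal sketch under stated assumptions: the function names are illustrative, the set loss is assumed to be given as a function of the binary mistake vector, and the specific hinge variant (clamping negative margins to zero before sorting) is one choice among several. The paper's full method additionally decomposes a general loss into submodular and supermodular parts and bounds the latter by slack-rescaling; only the Lovász-hinge component, which is convex when the set loss is submodular, is sketched here.

```python
import numpy as np

def dice_difference(mistakes, y_true):
    """Sørensen-Dice difference between y_true and the prediction obtained
    by flipping y_true at the mistaken positions (boolean arrays).
    Returns a value in [0, 1]; 0 when there are no mistakes."""
    y_pred = np.logical_xor(y_true, mistakes)
    inter = np.logical_and(y_true, y_pred).sum()
    denom = y_true.sum() + y_pred.sum()
    return 1.0 - (2.0 * inter / denom if denom > 0 else 1.0)

def lovasz_hinge(set_loss, margins):
    """Evaluate a Lovász-extension-style surrogate of `set_loss` at the
    margin vector `margins`, where margins[i] = 1 - y_i * f(x_i).
    `set_loss` maps a boolean mistake vector to a nonnegative loss with
    set_loss(all-False) == 0."""
    m = np.maximum(margins, 0.0)           # clamp: one hinge variant
    order = np.argsort(-m)                 # visit margins in decreasing order
    mistakes = np.zeros(len(m), dtype=bool)
    surrogate, prev = 0.0, 0.0
    for j in order:
        mistakes[j] = True                 # grow the mistake set one item at a time
        cur = set_loss(mistakes)
        surrogate += (cur - prev) * m[j]   # telescoping loss increments
        prev = cur
    return surrogate
```

For a modular (Hamming-type) set loss this reduces to the usual sum of per-example hinge losses, while for a non-modular loss such as the Dice difference it yields a piecewise-linear surrogate computable in O(p log p) time plus p evaluations of the set loss, consistent with the polynomial-time subgradient claim above.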
Document type: Conference papers

https://hal.inria.fr/hal-01417108
Contributor: Jiaqian Yu
Submitted on: Thursday, December 15, 2016 - 12:06:02 PM
Last modification on: Thursday, February 21, 2019 - 10:31:47 AM
Long-term archiving on: Thursday, March 16, 2017 - 5:30:30 PM

File: benelearn2016.pdf (produced by the author(s))

Identifiers

  • HAL Id: hal-01417108, version 1

Citation

Jiaqian Yu, Matthew Blaschko. A Convex Surrogate Operator for General Non-Modular Loss Functions. The 25th Belgian-Dutch Conference on Machine Learning, Sep 2016, Kortrijk, Belgium. ⟨hal-01417108⟩
