
A Convex Surrogate Operator for General Non-Modular Loss Functions

Abstract: In this work, a novel generic convex surrogate for general non-modular loss functions is introduced, providing for the first time a tractable solution for loss functions that are neither supermodular nor submodular. The surrogate is based on a submodular-supermodular decomposition: it is the sum of two convex surrogates that separately bound the supermodular component, via slack rescaling, and the submodular component, via the Lovász hinge. The resulting surrogate is convex, piecewise linear, and an extension of the loss function, and its subgradient can be computed in polynomial time. Empirical results are reported on a non-submodular loss based on the Sørensen-Dice difference function, demonstrating the improved performance, efficiency, and scalability of the novel convex surrogate.
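As a rough illustration of one ingredient named in the abstract, the sketch below evaluates the Lovász extension, which underlies the Lovász hinge used to bound the submodular component. This is not the authors' implementation; the function names and the scalar set-function interface (`loss(k)` = loss value when the `k` worst-margin predictions are wrong, assumed increasing with `loss(0) = 0`) are assumptions made for the example.

```python
def lovasz_extension(margins, loss):
    """Evaluate the Lovász extension of the set function `loss`
    at the real vector `margins` (a minimal sketch, not the paper's code).

    Sort the components in decreasing order, then weight each sorted
    component by the discrete derivative of `loss` along the chain of
    growing sets induced by that sort. For submodular `loss`, the result
    is a convex, piecewise-linear extension of the set function.
    """
    order = sorted(range(len(margins)), key=lambda i: -margins[i])
    value = 0.0
    for k, i in enumerate(order, start=1):
        value += margins[i] * (loss(k) - loss(k - 1))
    return value

# Sanity check with a modular (additive) loss, each error costing 1:
# the extension then reduces to the plain sum of the components.
additive = lambda k: float(k)
print(lovasz_extension([0.5, 2.0, -1.0], additive))  # 1.5
```

With a concave count-based loss such as `loss(k) = min(k, 1)`, only the first (largest) sorted component receives nonzero weight, so the extension returns the maximum component, which is the familiar max-margin behaviour.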
Document type: Conference papers
Contributor: Jiaqian Yu
Submitted on: Thursday, December 15, 2016 - 12:06:02 PM
Last modification on: Thursday, February 3, 2022 - 3:01:40 AM
Long-term archiving on: Thursday, March 16, 2017 - 5:30:30 PM


Files produced by the author(s)


  • HAL Id: hal-01417108, version 1


Jiaqian B Yu, Matthew Blaschko. A Convex Surrogate Operator for General Non-Modular Loss Functions. The 25th Belgian-Dutch Conference on Machine Learning, Sep 2016, Kortrijk, Belgium. ⟨hal-01417108⟩
