Lagrangian relaxation for SVM feature selection

Abstract: We discuss a Lagrangian-relaxation-based heuristic for feature selection within the Support Vector Machine (SVM) framework for binary classification. In particular, we embed into the objective function a weighted combination of the L1 and L0 norms of the normal to the separating hyperplane. The result is a Mixed Binary Linear Programming problem that is well suited to a Lagrangian relaxation approach. Exploiting a property of the optimal multiplier setting, we apply a well-established nonsmooth optimization ascent algorithm to solve the resulting Lagrangian dual. At every ascent step the proposed approach provides, at low computational cost, both a lower bound on the optimal objective value and a feasible solution. We present the results of numerical experiments on several benchmark datasets.
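To give a concrete picture of the scheme outlined in the abstract, the following is a minimal, illustrative Python sketch of the general idea only, not the paper's formulation or its nonsmooth ascent algorithm. It assumes an L1/L0-penalised soft-margin linear SVM modelled with big-M linking constraints |w_j| <= M z_j; those linking constraints are dualised, the continuous subproblem is solved as an LP, the binary part is solved by inspection, a cheap feasible solution is recovered from the LP point, and a plain projected subgradient step updates the multipliers. The weights C1 and C0, the big-M value, and the synthetic data are assumptions made purely for illustration.

```python
# Illustrative sketch only: a generic Lagrangian-relaxation loop for an
# L1/L0-penalised linear SVM, NOT the exact model or ascent method of the
# paper. C1, C0 and the big-M constant are assumed values.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n = 40, 8                          # samples, features (synthetic data)
X = rng.normal(size=(m, n))
y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=m))

C1, C0, M = 0.1, 1.0, 10.0            # L1 weight, L0 weight, big-M (assumed)
lam = np.zeros(n)                     # multipliers for |w_j| <= M z_j
best_lb, best_ub = -np.inf, np.inf

for k in range(50):
    # Continuous part of the relaxed problem: an LP in (w, b, xi, u),
    # where u_j >= |w_j| linearises the absolute values.
    c = np.concatenate([np.zeros(n + 1), np.ones(m), C1 + lam])
    A, rhs = [], []
    for i in range(m):                # y_i (w.x_i + b) >= 1 - xi_i
        row = np.zeros(2 * n + 1 + m)
        row[:n] = -y[i] * X[i]
        row[n] = -y[i]
        row[n + 1 + i] = -1.0
        A.append(row); rhs.append(-1.0)
    for j in range(n):                # +w_j - u_j <= 0 and -w_j - u_j <= 0
        for s in (1.0, -1.0):
            row = np.zeros(2 * n + 1 + m)
            row[j] = s
            row[n + 1 + m + j] = -1.0
            A.append(row); rhs.append(0.0)
    bounds = [(None, None)] * (n + 1) + [(0, None)] * (m + n)
    res = linprog(c, A_ub=np.array(A), b_ub=np.array(rhs), bounds=bounds)
    w = res.x[:n]
    xi = res.x[n + 1:n + 1 + m]
    u = res.x[n + 1 + m:]

    # Binary part of the relaxed problem: each z_j is fixed by inspection.
    z_cost = C0 - lam * M
    lb = res.fun + np.minimum(z_cost, 0.0).sum()   # valid lower bound
    best_lb = max(best_lb, lb)

    # Cheap feasible point: open exactly the features actually used
    # (feasible for the original model provided M is large enough).
    z_feas = np.abs(w) > 1e-6
    ub = xi.sum() + C1 * np.abs(w).sum() + C0 * z_feas.sum()
    best_ub = min(best_ub, ub)

    # Projected subgradient step on the multipliers of |w_j| <= M z_j.
    g = u - M * (z_cost < 0)
    lam = np.maximum(0.0, lam + (1.0 / (k + 1)) * g)

print(f"best lower bound {best_lb:.3f}, best feasible value {best_ub:.3f}")
```

As in the abstract, every iteration of such a loop yields both a valid lower bound (from the relaxed problem) and a feasible solution at low cost; the paper's actual method solves the Lagrangian dual with a dedicated nonsmooth ascent algorithm that exploits a property of the optimal multipliers, rather than the generic diminishing step size used above.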
Document type: Journal article

Cited literature: 29 references

https://hal.inria.fr/hal-01666453
Contributor: Bernard Fortz
Submitted on: December 21, 2017
Last modified on: June 18, 2019

File: Gaudiosoetal-preprint.pdf (produced by the author(s))

Citation

Manlio Gaudioso, Enrico Gorgone, Martine Labbé, Antonio Manuel Rodríguez-Chía. Lagrangian relaxation for SVM feature selection. Computers and Operations Research, Elsevier, 2017, 87, pp. 137-145. ⟨10.1016/j.cor.2017.06.001⟩. ⟨hal-01666453⟩
