Lagrangian relaxation for SVM feature selection

Abstract: We discuss a Lagrangian-relaxation-based heuristic for feature selection within the Support Vector Machine (SVM) framework for binary classification. In particular, we embed into the objective function a weighted combination of the L1 and L0 norms of the normal to the separating hyperplane. The resulting model is a Mixed Binary Linear Programming problem that is well suited to a Lagrangian relaxation approach. Exploiting a property of the optimal multiplier setting, we apply a well-established nonsmooth optimization ascent algorithm to solve the resulting Lagrangian dual. At every ascent step the proposed approach yields, at low computational cost, both a lower bound on the optimal objective value and a feasible solution. We report the results of numerical experiments on several benchmark datasets.
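For readers outside the mathematical optimization community, the following is a minimal sketch, under assumed notation, of how a weighted L1/L0 feature-selection penalty can be linearized into a Mixed Binary Linear Program of the kind the abstract refers to. The weights alpha and beta, the big-M constant M, the auxiliary variables u and z, and the exact constraint set are illustrative assumptions, not necessarily the formulation used in the paper.

\begin{align*}
\min_{w,\, b,\, \xi,\, u,\, z}\quad & \sum_{i=1}^{m} \xi_i \;+\; \alpha \sum_{j=1}^{n} u_j \;+\; \beta \sum_{j=1}^{n} z_j \\
\text{s.t.}\quad & y_i \bigl( w^{\top} x_i + b \bigr) \ge 1 - \xi_i, \quad i = 1,\dots,m, \\
& -u_j \le w_j \le u_j, \quad j = 1,\dots,n, \\
& -M z_j \le w_j \le M z_j, \quad j = 1,\dots,n, \\
& \xi_i \ge 0, \quad u_j \ge 0, \quad z_j \in \{0,1\}.
\end{align*}

Here $\sum_j u_j$ plays the role of $\|w\|_1$ (at an optimum $u_j = |w_j|$), while $z_j = 0$ forces $w_j = 0$, so $\sum_j z_j$ counts the selected features and stands in for $\|w\|_0$. Dualizing a suitable subset of the constraints with nonnegative multipliers makes the Lagrangian easy to minimize; maximizing the resulting concave, nonsmooth dual by an ascent method then produces at each iterate a valid lower bound and, after a simple feasibility recovery, a feasible solution, which is the scheme the abstract summarizes.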
Document type: Journal article
Computers & Operations Research, 2017, 87, pp. 137-145. doi:10.1016/j.cor.2017.06.001

Cited literature: [29 references]

https://hal.inria.fr/hal-01666453
Contributor: Bernard Fortz
Submitted on: Thursday, December 21, 2017 - 14:26:35
Last modified on: Sunday, July 22, 2018 - 15:42:01

File

Gaudiosoetal-preprint.pdf
Files produced by the author(s)

Citation

Manlio Gaudioso, Enrico Gorgone, Martine Labbé, Antonio Manuel Rodríguez-Chía. Lagrangian relaxation for SVM feature selection. Computers & Operations Research, 2017, 87, pp. 137-145. doi:10.1016/j.cor.2017.06.001. hal-01666453.

Metrics

Record views: 159
File downloads: 64