Non-Smooth Regularization: Improvement to Learning Framework through Extrapolation - Inria - Institut national de recherche en sciences et technologies du numérique
Journal Article in IEEE Transactions on Signal Processing, Year: 2022


Abstract

Deep learning architectures employ various regularization terms to handle different types of priors. Non-smooth regularization terms have shown promising performance in deep learning architectures, and a learning framework has recently been proposed to train autoencoders with such regularization terms. While this framework efficiently manages the non-smooth term during training through proximal operators, it is limited to autoencoders and suffers from low convergence speed due to the several optimization sub-problems that must be solved sequentially. In this paper, we address these issues by extending the framework to general feed-forward neural networks and introducing variable extrapolation, which can dramatically increase the convergence speed of each sub-problem. We show that the proposed update rules converge to a critical point of the objective function under mild conditions. To compare the resulting framework with the previously proposed one, we consider the problem of training sparse autoencoders and of robustifying deep neural architectures against both targeted and untargeted attacks. Simulations show superior performance in both convergence speed and final objective function value.
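The abstract's two key ingredients — handling a non-smooth regularizer through its proximal operator, and accelerating each sub-problem with variable extrapolation — can be illustrated with a generic sketch. The following is not the paper's exact algorithm, but a minimal FISTA-style extrapolated proximal gradient method on a hypothetical sparse least-squares sub-problem (the names `soft_threshold` and `prox_grad_extrapolated` are illustrative, not from the paper):

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||x||_1 (the classic soft-thresholding map).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def prox_grad_extrapolated(A, b, lam, n_iter=200):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 with an extrapolated
    (Nesterov/FISTA-style) proximal gradient iteration."""
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y = x.copy()                       # extrapolated point
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)       # gradient of the smooth part at y
        x_new = soft_threshold(y - grad / L, lam / L)   # proximal step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # Extrapolation: move past x_new along the direction of progress.
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

Without the extrapolation step (i.e., `y = x_new`), this reduces to plain proximal gradient descent, whose worst-case rate is O(1/k) versus O(1/k²) for the extrapolated variant — the kind of per-sub-problem speedup the paper exploits.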
Main file: Amini_TSP_2021.pdf (1.04 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03586153, version 1 (23-02-2022)

Identifiers

Cite

Sajjad Amini, Mohammad Soltanian, Mostafa Sadeghi, Shahrokh Ghaemmaghami. Non-Smooth Regularization: Improvement to Learning Framework through Extrapolation. IEEE Transactions on Signal Processing, 2022, 70, pp. 1213-1223. ⟨10.1109/TSP.2022.3154969⟩. ⟨hal-03586153⟩
107 Views
179 Downloads

