
Variance Based Samples Weighting for Supervised Deep Learning

Abstract: In the context of supervised learning of a function by a Neural Network (NN), we claim and empirically justify that a NN yields better results when the distribution of the data set focuses on regions where the function to learn is steeper. We first translate this assumption into a mathematically workable form using a Taylor expansion. Theoretical derivations then allow us to construct a methodology that we call Variance Based Samples Weighting (VBSW). VBSW uses the local variance of the labels to weight the training points. This methodology is general, scalable, cost-effective, and significantly increases the performance of a large class of NNs on various classification and regression tasks over image, text, and multivariate data. We highlight its benefits with experiments involving NNs ranging from shallow linear NNs to ResNet and BERT.
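The core idea of VBSW, as stated in the abstract, is to weight each training point by the local variance of the labels, which is large where the target function is steep. The sketch below is a simplified illustration of that idea, not the authors' exact estimator: for each sample it computes the variance of the labels of its k nearest neighbours (here for 1-D inputs, with `k` chosen arbitrarily) and normalises the resulting weights to have mean one.

```python
import statistics

def vbsw_weights(xs, ys, k=5):
    """Illustrative sketch of variance-based sample weighting.

    For each point x_i, find the k nearest neighbours (including the
    point itself) and use the variance of their labels as the raw
    weight. Steep regions of the target function yield high local
    label variance, hence larger weights.
    """
    raw = []
    for xi in xs:
        nn = sorted(range(len(xs)), key=lambda j: abs(xs[j] - xi))[:k]
        raw.append(statistics.pvariance([ys[j] for j in nn]))
    mean = sum(raw) / len(raw)
    # Normalise so the weights average to 1; fall back to uniform
    # weights if all local variances are zero (constant labels).
    return [w / mean for w in raw] if mean > 0 else [1.0] * len(raw)

# Toy example: a function that is flat on [0, 0.5) and steep afterwards.
xs = [i / 49 for i in range(50)]
ys = [0.0 if x < 0.5 else 10.0 * (x - 0.5) for x in xs]
weights = vbsw_weights(xs, ys)
```

In a deep learning framework, these weights would typically be passed as per-sample weights to the loss (e.g. the `sample_weight` argument of `Model.fit` in Keras), so the optimiser emphasises the steep regions during training.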
Contributor: Paul Novello
Submitted on : Thursday, January 28, 2021 - 10:52:13 AM
Last modification on : Saturday, May 1, 2021 - 3:39:37 AM


  • HAL Id: hal-02885827, version 3
  • arXiv: 2101.07561


Paul Novello, Gaël Poëtte, David Lugato, Pietro Congedo. Variance Based Samples Weighting for Supervised Deep Learning. 2021. ⟨hal-02885827v3⟩


