# A Bayesian Neural Network based on Dropout Regulation

ORPAILLEUR - Knowledge Representation, Reasoning
Inria Nancy - Grand Est, LORIA - NLPKD - Department of Natural Language Processing & Knowledge Discovery
Abstract: Bayesian Neural Networks (BNNs) have recently emerged in deep learning for handling uncertainty estimation in classification tasks, and are used in many application domains such as astrophysics and autonomous driving. BNNs assume a prior distribution over the weights of a neural network instead of point estimates, thereby enabling the estimation of both the aleatoric and epistemic uncertainty of the model prediction. Moreover, a particular type of BNN, namely MC Dropout, assumes a Bernoulli distribution on the weights by using Dropout. Several attempts to optimize the dropout rate exist, e.g. using a variational approach. In this paper, we present a new method called "Dropout Regulation" (DR), which consists of automatically adjusting the dropout rate during training using a controller as used in automation. DR allows for a precise estimation of the uncertainty which is comparable to the state-of-the-art while remaining simple to implement.
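The two ideas the abstract combines can be sketched in a few lines of NumPy: MC Dropout keeps the Bernoulli dropout masks active at test time and averages several stochastic forward passes to obtain a predictive distribution and an uncertainty score, while Dropout Regulation adjusts the dropout rate with a feedback controller. The controller below is a minimal, hypothetical proportional controller driven by the train/validation gap; the paper's actual error signal and controller gains are not specified in this abstract, so treat every name and constant here as an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_dropout_predict(x, W1, W2, p=0.5, T=100):
    """MC Dropout inference: T stochastic forward passes of a tiny
    two-layer MLP with dropout kept active at test time."""
    probs = []
    for _ in range(T):
        h = np.maximum(x @ W1, 0.0)            # ReLU hidden layer
        mask = rng.random(h.shape) > p         # Bernoulli dropout mask
        h = h * mask / (1.0 - p)               # inverted-dropout scaling
        logits = h @ W2
        e = np.exp(logits - logits.max())      # stable softmax
        probs.append(e / e.sum())
    probs = np.stack(probs)
    mean = probs.mean(axis=0)                  # predictive distribution
    entropy = -np.sum(mean * np.log(mean + 1e-12))  # total uncertainty
    return mean, entropy

def regulate_dropout(p, train_acc, val_acc, k=0.5, p_min=0.05, p_max=0.95):
    """Hypothetical proportional controller for the dropout rate:
    raise p when the train/validation gap (an overfitting signal)
    grows, lower it when the gap shrinks."""
    error = train_acc - val_acc
    return float(np.clip(p + k * error, p_min, p_max))
```

For example, with `train_acc=0.9` and `val_acc=0.7` the gap of 0.2 pushes `p` from 0.5 to 0.6, whereas equal accuracies leave `p` unchanged; a full implementation would apply such an update once per epoch during training.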
Document type: Conference papers

https://hal.inria.fr/hal-03122764
Contributor: Claire Theobald
Submitted on: Tuesday, February 2, 2021 - 11:47:02 AM
Last modification on: Wednesday, November 3, 2021 - 7:57:40 AM
Long-term archiving on: Monday, May 3, 2021 - 6:08:48 PM

### Files

ct-etal-wuml20.pdf
Files produced by the author(s)

### Identifiers

• HAL Id : hal-03122764, version 1
• arXiv: 2102.01968

### Citation

Claire Theobald, Frédéric Pennerath, Brieuc Conan-Guez, Miguel Couceiro, Amedeo Napoli. A Bayesian Neural Network based on Dropout Regulation. Workshop on Uncertainty in Machine Learning (WUML) at ECML-PKDD 2020 Conference, Eyke Hüllermeier; Sébastien Destercke, Sep 2020, N.A. (online), France. ⟨hal-03122764⟩
