Rényi Divergence and Kullback-Leibler Divergence

Tim Van Erven, Peter Harremoës

Affiliation: SELECT - Model selection in statistical learning, Inria Saclay - Île-de-France, LMO - Laboratoire de Mathématiques d'Orsay, CNRS - Centre National de la Recherche Scientifique : UMR
Abstract: Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon's entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibler divergence. We review and extend the most important properties of Rényi divergence and Kullback-Leibler divergence, including convexity, continuity, limits of σ-algebras and the relation of the special order 0 to the Gaussian dichotomy and contiguity. We also extend the known equivalence between channel capacity and minimax redundancy to continuous channel inputs (for all orders), and present several other minimax results.
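As a concrete illustration of the relation mentioned in the abstract, the sketch below assumes the standard discrete-distribution formulas D_α(P‖Q) = (1/(α−1)) log Σ_i p_i^α q_i^(1−α) and D(P‖Q) = Σ_i p_i log(p_i/q_i) (in nats), and checks numerically that the Rényi divergence approaches the Kullback-Leibler divergence as the order α tends to 1. The helper names are illustrative only and do not appear in the paper.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P||Q) for discrete distributions, in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P||Q) of order alpha (alpha > 0, alpha != 1), in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

# Hypothetical example distributions: as alpha -> 1, D_alpha approaches D_KL.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
for alpha in (0.5, 0.9, 0.999, 1.001, 2.0):
    print(f"alpha = {alpha}: {renyi_divergence(p, q, alpha):.6f}")
print(f"KL divergence: {kl_divergence(p, q):.6f}")
```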
Document type:
Preprint / working paper
Accepted by IEEE Transactions on Information Theory. To appear. 2013

https://hal.inria.fr/hal-00758191
Contributor: Tim Van Erven
Submitted on: Wednesday, November 28, 2012 - 11:54:41
Last modified on: Saturday, February 10, 2018 - 01:00:56

Identifiers

  • HAL Id : hal-00758191, version 1
  • ARXIV : 1206.2459

Citation

Tim Van Erven, Peter Harremoës. Rényi Divergence and Kullback-Leibler Divergence. Accepted by IEEE Transactions on Information Theory. To appear. 2013. 〈hal-00758191〉
