
Demographic Bias in Biometrics: A Survey on an Emerging Challenge

Abstract: Systems incorporating biometric technologies have become ubiquitous in personal, commercial, and governmental identity management applications. Both cooperative (e.g., access control) and noncooperative (e.g., surveillance and forensics) systems have benefited from biometrics. Such systems rely on the uniqueness of certain biological or behavioral characteristics of human beings, which enable individuals to be reliably recognized using automated algorithms. Recently, however, there has been a wave of public and academic concern regarding the existence of systemic bias in automated decision systems (including biometrics). Most prominently, face recognition algorithms have often been labeled as "racist" or "biased" by the media, nongovernmental organizations, and researchers alike. The main contributions of this article are: 1) an overview of the topic of algorithmic bias in the context of biometrics; 2) a comprehensive survey of the existing literature on biometric bias estimation and mitigation; 3) a discussion of the pertinent technical and social matters; and 4) an outline of the remaining challenges and future work items, from both technological and social points of view.
Contributor: Antitza Dantcheva
Submitted on: Friday, February 19, 2021 - 11:35:37 AM
Last modification on: Wednesday, November 3, 2021 - 8:08:42 AM
Long-term archiving on: Thursday, May 20, 2021 - 6:34:18 PM
Pawel Drozdowski, Christian Rathgeb, Antitza Dantcheva, Naser Damer, Christoph Busch. Demographic Bias in Biometrics: A Survey on an Emerging Challenge. IEEE Transactions on Technology and Society, 2020. ⟨10.1109/TTS.2020.2992344⟩ ⟨hal-03146646⟩