Calibrating Human-AI Collaboration: Impact of Risk, Ambiguity and Transparency on Algorithmic Bias

Abstract: Transparent Machine Learning (ML) is often argued to increase trust in the predictions of algorithms; however, the growth of new interpretability approaches has not been accompanied by a growth in studies investigating how the interaction between humans and Artificial Intelligence (AI) systems benefits from transparency. The right level of transparency can increase trust in an AI system, while inappropriate levels of transparency can lead to algorithmic bias. In this study we demonstrate that, depending on certain personality traits, humans exhibit different susceptibilities to algorithmic bias. Our main finding is that susceptibility to algorithmic bias depends significantly on annotators' affinity for risk. These findings help shed light on the previously underrepresented role of human personality in human-AI interaction. We believe that taking these aspects into account when building transparent AI systems can help ensure more responsible use of AI systems.
Metadata

https://hal.inria.fr/hal-03414725
Contributor: Hal Ifip
Submitted on: Thursday, November 4, 2021 - 3:57:03 PM
Last modification on: Friday, November 5, 2021 - 3:58:06 AM
Long-term archiving on: Saturday, February 5, 2022 - 7:07:15 PM

File

Restricted access
To satisfy the distribution rights of the publisher, the document is embargoed until 2023-01-01.

Licence


Distributed under a Creative Commons Attribution 4.0 International License

Identifiers

HAL Id: hal-03414725
DOI: 10.1007/978-3-030-57321-8_24

Citation

Philipp Schmidt, Felix Biessmann. Calibrating Human-AI Collaboration: Impact of Risk, Ambiguity and Transparency on Algorithmic Bias. 4th International Cross-Domain Conference for Machine Learning and Knowledge Extraction (CD-MAKE), Aug 2020, Dublin, Ireland. pp.431-449, ⟨10.1007/978-3-030-57321-8_24⟩. ⟨hal-03414725⟩
