Conference papers

Robot Emotional State through Bayesian Visuo-Auditory Perception

Abstract: In this paper we focus on auditory analysis as the sensory stimulus and on vocalization synthesis as the output signal. Our scenario is a single robot interacting with a single human through the vocalization channel. Note that vocalization goes far beyond speech: while speech analysis tells us what was said, vocalization analysis tells us how it was said. A social robot should be able to perform actions in different manners according to its emotional state. We therefore propose a novel Bayesian approach to determine the emotional state the robot should assume according to how its interlocutor is talking to it. Results show that the classification behaves as expected, converging to the correct decision after two iterations.
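The abstract describes a recursive Bayesian classifier that converges on an emotional state after a couple of observations. The following is a minimal, self-contained sketch of that general idea; the state names, vocalization cues, and likelihood values are all hypothetical placeholders, not the paper's actual model.

```python
# Minimal sketch of recursive Bayesian emotional-state classification.
# STATES, cues, and LIKELIHOOD values below are hypothetical, for
# illustration only; the paper's actual model differs.

STATES = ["happy", "sad", "angry", "neutral"]  # hypothetical emotional states

# Hypothetical likelihoods P(cue | state) for two vocalization cues.
LIKELIHOOD = {
    "soft":  {"happy": 0.3, "sad": 0.4, "angry": 0.1, "neutral": 0.2},
    "harsh": {"happy": 0.1, "sad": 0.2, "angry": 0.6, "neutral": 0.1},
}

def bayes_update(prior, cue):
    """One recursive Bayesian step: posterior is proportional to
    likelihood times prior, then normalized to sum to 1."""
    unnorm = {s: LIKELIHOOD[cue][s] * prior[s] for s in STATES}
    total = sum(unnorm.values())
    return {s: p / total for s, p in unnorm.items()}

# Start from a uniform prior and fuse two consecutive observations;
# the belief concentrates on the state most consistent with both.
belief = {s: 1.0 / len(STATES) for s in STATES}
for cue in ["harsh", "harsh"]:
    belief = bayes_update(belief, cue)

print(max(belief, key=belief.get))  # prints "angry"
```

With these placeholder numbers, two consecutive "harsh" cues drive the posterior for "angry" from 0.25 to roughly 0.86, mirroring the two-iteration convergence the abstract reports.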
Cited literature: 16 references

https://hal.inria.fr/hal-01566595
Contributor: Hal Ifip
Submitted on: Friday, July 21, 2017 - 11:25:49 AM
Last modification on: Friday, July 21, 2017 - 11:30:44 AM

File

978-3-642-19170-1_18_Chapter.p...
Files produced by the author(s)

Licence


Distributed under a Creative Commons Attribution 4.0 International License

Citation

José Prado, Carlos Simplício, Jorge Dias. Robot Emotional State through Bayesian Visuo-Auditory Perception. 2nd Doctoral Conference on Computing, Electrical and Industrial Systems (DoCEIS), Feb 2011, Costa de Caparica, Portugal. pp.165-172, ⟨10.1007/978-3-642-19170-1_18⟩. ⟨hal-01566595⟩

Metrics: 75 record views, 144 file downloads