J. Allwood, J. Nivre, and E. Ahlsén, On the semantics and pragmatics of linguistic feedback, Journal of Semantics, vol.9, issue.1, pp.1-26, 1992.

J. N. Bailenson, K. Swinth, C. Hoyt, S. Persky, A. Dimov et al., The independent and interactive effects of embodied-agent appearance and behavior on self-report, cognitive, and behavioral markers of copresence in immersive virtual environments, Presence: Teleoperators and Virtual Environments, vol.14, issue.4, 2005.

B. D. Bartholow, M. Fabiani, G. Gratton, and B. A. Bettencourt, A psychophysiological examination of cognitive processing of and affective responses to social expectancy violations, Psychological Science, vol.12, issue.3, pp.197-204, 2001.

R. Bertrand and R. Espesser, Prosodic cues of back-channel signals in French conversational speech, International Conference on Prosody and Pragmatics (NWCL), 2003.

R. Bertrand, G. Ferré, P. Blache, R. Espesser et al., Backchannels revisited from a multimodal perspective, Proceedings of Auditory-Visual Speech Processing, 2007.

M. Besson et al., An Event-Related Potential (ERP) Analysis of Semantic Congruity and Repetition Effects in Sentences, Journal of Cognitive Neuroscience, vol.4, pp.132-149, 1992.

E. Bevacqua, M. Mancini, R. Niewiadomski, and C. Pelachaud, An expressive ECA showing complex emotions, Proceedings of AISB'07 "Language, Speech and Gesture for Expressive Characters", pp.208-216, 2007.

E. Bevacqua, M. Mancini, C. Pelachaud, M. Chollet, M. Ochs et al., 2008.

B. Bigi, SPPAS – Multi-lingual Approaches to the Automatic Annotation of Speech, The Phonetician, International Society of Phonetic Sciences, pp.262-269.

R. Gardner, When Listeners Talk: Response tokens and listener stance, Benjamins, Amsterdam, 2001.

M. Dyck, M. Winbeck, S. Leiberg, Y. Chen, R. C. Gur et al., Recognition profile of emotions in natural and virtual faces, PLoS ONE, vol.3, issue.11, 2008.

M. Eimer, The Face-Sensitive N170 Component of the Event-Related Brain Potential, Oxford Handbook of Face Perception, 2011.

J. E. Fox Tree, Listening in on monologues and dialogues, Discourse Processes, vol.27, pp.35-53, 1999.

J. Gratch, A. Okhmatovskaia, F. Lamothe, S. Marsella, M. Morales et al., Virtual rapport, International Workshop on Intelligent Virtual Agents, pp.14-27, 2006.

M. J. Herrmann, A. C. Ehlis, H. Ellgring, and A. J. Fallgatter, Early stages (P100) of face perception in humans as measured with event-related potentials (ERPs), Journal of Neural Transmission, vol.112, issue.8, 2004.

M. Kutas and S. A. Hillyard, Reading senseless sentences: brain potentials reflect semantic incongruity, Science, vol.207, pp.203-204, 1980.

M. Kutas and K. D. Federmeier, Thirty years and counting: finding meaning in the N400 component of the event-related brain potential (ERP), Annual Review of Psychology, vol.62, pp.621-647, 2011.

A. Krombholz, F. Schaefer, and W. Boucsein, Modification of N170 by different emotional expression of schematic faces, Biological Psychology, vol.76, issue.3, pp.156-162, 2007.

H. Leuthold, R. Filik, K. Murphy, and I. G. Mackenzie, The on-line processing of socio-emotional information in prototypical scenarios: inferences from brain potentials, Social Cognitive and Affective Neuroscience, vol.7, issue.4, pp.457-466, 2011.

L. P. Morency, I. de Kok, and J. Gratch, A probabilistic multimodal approach for predicting listener backchannels, Autonomous Agents and Multi-Agent Systems, vol.20, issue.1, pp.70-84, 2010.

M. Ochs, G. de Montcheuil, J. Pergandi, J. Saubesty, C. Pelachaud et al., An architecture of virtual patient simulation platform to train doctors to break bad news, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01793374

C. Pelachaud, Studies on gesture expressivity for a virtual agent, Speech Communication, vol.51, issue.7, pp.630-639, 2009.

B. E. Penteado, M. Ochs, R. Bertrand, and P. Blache, Evaluating Temporal Predictive Features for Virtual Patients Feedbacks, Proceedings of the 19th ACM International Conference on Intelligent Virtual Agents, 2019.
URL : https://hal.archives-ouvertes.fr/hal-02355386

Mining a Multimodal Corpus of Doctor's Training for Virtual Patient's Feedbacks, International Conference on Multimodal Interaction (ICMI).

R. Poppe, K. P. Truong, D. Reidsma, and D. Heylen, Backchannel strategies for artificial listeners, International Workshop on Intelligent Virtual Agents, pp.146-158, 2010.

L. Prévot, J. Gorisch, and R. Bertrand, A CUP of CoFee: A large collection of feedback utterances provided with communicative function annotations, Language Resources and Evaluation Conference (LREC), 2016.

B. Rauchbauer, B. Nazarian, M. Bourhis, M. Ochs, L. Prévot et al., Brain activity during reciprocal social interaction investigated using conversational robots as control condition, Philosophical Transactions of the Royal Society B, vol.374, issue.1771, 2019.
URL : https://hal.archives-ouvertes.fr/hal-02067722

R. Righart and B. de Gelder, Context influences early perceptual analysis of faces: An electrophysiological study, Cerebral Cortex, vol.16, pp.1249-1257, 2006.

E. A. Schegloff, Discourse as an interactional achievement: some uses of "uh huh" and other things that come between sentences, Analyzing Discourse: Text and Talk, pp.71-93, 1982.

R. Sprengelmeyer and I. Jentzsch, Event related potentials and the perception of intensity in facial expressions, Neuropsychologia, vol.44, issue.14, pp.2899-2906, 2006.

B. A. Urgen, M. Kutas, and A. P. Saygin, Uncanny valley as a window into predictive processing in the social brain, Neuropsychologia, vol.114, pp.181-185, 2018.

H. Vilhjálmsson, N. Cantelmo, J. Cassell, N. E. Chafai, M. Kipp et al., The behavior markup language: Recent developments and challenges, International Workshop on Intelligent Virtual Agents, pp.99-111, 2007.

N. Ward and W. Tsukahara, Prosodic features which cue back-channel feedback in English and Japanese, Journal of Pragmatics, vol.32, pp.1177-1207, 2000.

Resources: The Brain-IHM dataset is available on the ORTOLANG platform.