N. Ackovska, Taxonomy of learning agents, 2010.

C. Adam, W. Johal, D. Pellier, H. Fiorino, and S. Pesty, Social Human-Robot Interaction: A New Cognitive and Affective Interaction-Oriented Architecture, International Conference on Social Robotics, pp.253-263, 2016.
URL : https://hal.archives-ouvertes.fr/hal-01398218

C. Becker-Asano, WASABI: Affect simulation for agents with believable interactivity, PhD thesis, Bielefeld University, 2008.

J. Beer, A. D. Fisk, and W. A. Rogers, Toward a Framework for Levels of Robot Autonomy in Human-Robot Interaction, Journal of Human-Robot Interaction, vol.3, issue.2, p.74, 2014.
DOI : 10.5898/JHRI.3.2.Beer

C. Breazeal, Toward sociable robots, Robotics and Autonomous Systems, vol.42, issue.3-4, pp.167-175, 2003.
DOI : 10.1016/S0921-8890(02)00373-1

C. Breazeal and B. Scassellati, How to build robots that make friends and influence people, Proceedings 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp.858-863, 1999.
DOI : 10.1109/IROS.1999.812787

G. Castellano, A. Pereira, I. Leite, A. Paiva, and P. W. McOwan, Detecting user engagement with a robot companion using task and social interaction-based features, Proceedings of the 2009 International Conference on Multimodal Interfaces (ICMI-MLMI '09), pp.119-126, 2009.
DOI : 10.1145/1647314.1647336

A. R. Damasio, L'errore di Cartesio: emozione, ragione e cervello umano [Descartes' Error: Emotion, Reason, and the Human Brain], Italian translation by F. Macaluso, 1995.

F. De la Torre and J. Cohn, Facial expression analysis, in Visual Analysis of Humans, pp.377-409, 2011.

J. Dias, S. Mascarenhas, and A. Paiva, FAtiMA Modular: Towards an Agent Architecture with a Generic Appraisal Framework, Emotion Modeling, pp.44-56, 2014.
DOI : 10.1007/978-3-319-12973-0_3

P. Ekman, W. V. Friesen, and J. C. Hager, Facial Action Coding System (FACS): A technique for the measurement of facial action, 1978.

K. Han, D. Yu, and I. Tashev, Speech emotion recognition using deep neural network and extreme learning machine, Interspeech, pp.223-227, 2014.

F. Jimenez, T. Yoshikawa, T. Furuhashi, and M. Kanoh, An emotional expression model for educational-support robots, Journal of Artificial Intelligence and Soft Computing Research, vol.5, issue.1, pp.51-57, 2015.
DOI : 10.1515/jaiscr-2015-0018

S. Lemaignan, M. Warnier, E. A. Sisbot, A. Clodic, and R. Alami, Artificial cognition for social human-robot interaction: An implementation, Artificial Intelligence, vol.247, 2016.
DOI : 10.1016/j.artint.2016.07.002

R. Lowe and K. Kiryazov, Utilizing Emotions in Autonomous Robots: An Enactive Approach, Emotion Modeling, pp.76-98, 2014.
DOI : 10.1007/978-3-319-12973-0_5

P. Lucey, J. F. Cohn, T. Kanade, J. Saragih, Z. Ambadar et al., The Extended Cohn-Kanade Dataset (CK+): A complete dataset for action unit and emotion-specified expression, 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), pp.94-101, 2010.
DOI : 10.1109/CVPRW.2010.5543262

D. McColl, A. Hong, N. Hatakeyama, G. Nejat, and B. Benhabib, A Survey of Autonomous Human Affect Detection Methods for Social Robots Engaged in Natural HRI, Journal of Intelligent & Robotic Systems, vol.82, issue.1, pp.101-133, 2016.

D. McDuff, R. el Kaliouby, T. Senechal, M. Amr, J. Cohn et al., Affectiva-MIT Facial Expression Dataset (AM-FED): Naturalistic and Spontaneous Facial Expressions Collected "In-the-Wild", 2013 IEEE Conference on Computer Vision and Pattern Recognition Workshops, pp.881-888, 2013.
DOI : 10.1109/CVPRW.2013.130

M. P. Michalowski, S. Sabanovic, and R. Simmons, A spatial model of engagement for a social robot, 9th IEEE International Workshop on Advanced Motion Control (AMC), pp.762-767, 2006.
DOI : 10.1109/AMC.2006.1631755

O. Palinko, F. Rea, G. Sandini, and A. Sciutti, Eye gaze tracking for a humanoid robot, 2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids), pp.318-324, 2015.
DOI : 10.1109/HUMANOIDS.2015.7363561

O. Palinko, F. Rea, G. Sandini, and A. Sciutti, Robot reading human gaze: Why eye tracking is better than head tracking for human-robot collaboration, 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp.5048-5054, 2016.
DOI : 10.1109/IROS.2016.7759741

Y. Pan, P. Shen, and L. Shen, Speech emotion recognition using support vector machine, International Journal of Smart Home, vol.6, issue.2, pp.101-108, 2012.

F. Pedregosa, G. Varoquaux, A. Gramfort, V. Michel, B. Thirion et al., Scikit-learn: Machine learning in Python, Journal of Machine Learning Research, vol.12, pp.2825-2830, 2011.
URL : https://hal.archives-ouvertes.fr/hal-00650905

E. Sariyanidi, H. Gunes, and A. Cavallaro, Automatic Analysis of Facial Affect: A Survey of Registration, Representation, and Recognition, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.37, issue.6, pp.1113-1133, 2015.
DOI : 10.1109/TPAMI.2014.2366127

J. Schwarz, C. C. Marais, T. Leyvand, S. E. Hudson, and J. Mankoff, Combining body pose, gaze, and gesture to determine intention to interact in vision-based interfaces, Proceedings of the 32nd Annual ACM Conference on Human Factors in Computing Systems (CHI '14), pp.3443-3452, 2014.
DOI : 10.1145/2556288.2556989

A. Sloman, Varieties of affect and the CogAff architecture schema, Proceedings of the AISB'01 Symposium on Emotions, Cognition, and Affective Computing, The Society for the Study of Artificial Intelligence and Simulation of Behaviour, 2001.

A. Tanevska, F. Rea, G. Sandini, and A. Sciutti, Can emotions enhance the robot's cognitive abilities: a study in autonomous HRI with an emotional robot, Proceedings of the AISB Convention 2017: Symposium on Computational Modelling of Emotion, 2017.

D. Vaufreydaz, W. Johal, and C. Combe, Starting engagement detection towards a companion robot using multimodal features, Robotics and Autonomous Systems, vol.75, pp.4-16, 2016.
DOI : 10.1016/j.robot.2015.01.004

URL : https://hal.archives-ouvertes.fr/hal-01122396

D. Vernon, Artificial cognitive systems: A primer, 2014.