[Abe86] Abele A.: Functions of gaze in social interaction: Communication and monitoring. Journal of Nonverbal Behavior 10, 2 (1986), 83–101.

[AC76] Argyle M., Cook M.: Gaze and Mutual Gaze. Cambridge University Press, 1976.

[AHS02] Albrecht I., Haber J., Seidel H.-P.: Automatic generation of non-verbal facial expressions from speech. In Proc. Computer Graphics International (2002), pp. 283–293.

[Ama14] Amazon.com: Amazon Mechanical Turk, 2014.

[AM13] Andrist S., Mutlu B.: Conversational gaze aversion for virtual agents. In Intelligent Virtual Agents, Lecture Notes in Computer Science 8108 (2013), pp. 249–262.

[And81] Anderson J. H.: Ocular torsion in the cat after lesions of the interstitial nucleus of Cajal. Annals of the New York Academy of Sciences 374, 1 (1981), 865–871.

[APMG12a] Andrist S., Pejsa T., Mutlu B., Gleicher M.: Designing effective gaze mechanisms for virtual agents. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (2012), CHI '12, pp. 705–714.

[APMG12b] Andrist S., Pejsa T., Mutlu B., Gleicher M.: A head-eye coordination model for animating gaze shifts of virtual characters. In Proceedings of the 4th Workshop on Eye Gaze in Intelligent Human Machine Interaction (2012), Gaze-In '12, pp. 1–4.

[ATGM14] Andrist S., Tan X. Z., Gleicher M., Mutlu B.: Conversational gaze aversion for humanlike robots. In Proceedings of HRI 2014 (2014).

[AWH10] Anderson C., Wales A., Horne J.: PVT lapses differ according to eyes open, closed, or looking away. Sleep 33 (2010), 197–204.

[BAP*96] Bentin S., Allison T., Puce A., Perez E., McCarthy G.: Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience 8, 6 (1996), 551–565.

[BAS75] Bahill A. T., Adler D., Stark L.: Most naturally occurring human saccades have magnitudes of 15 degrees or less. Investigative Ophthalmology & Visual Science 14, 6 (1975), 468–469.

[BBBL01] Bailenson J. N., Blascovich J., Beall A. C., Loomis J. M.: Equilibrium theory revisited: Mutual gaze and personal space in virtual environments. Presence: Teleoperators & Virtual Environments 10, 6 (2001), 583–598.

[BBBL02] Bailenson J. N., Blascovich J., Beall A. C., Loomis J. M.: Interpersonal distance in immersive virtual environments. Personality and Social Psychology Bulletin 29 (2002), 1–15.

[BBW96] Burgoon J. K., Buller D. B., Woodall W. G.: Nonverbal Communication: The Unspoken Dialogue. McGraw-Hill, 1996.

[BCS75] Bahill A. T., Clark M. R., Stark L.: The main sequence, a tool for studying human eye movements. Mathematical Biosciences 24, 3–4 (1975), 191–204. doi:10.1016/0025-5564(75)90075-9

[BDG*07] Busso C., Deng Z., Grimm M., Neumann U., Narayanan S.: Rigid head motion in expressive speech animation: Analysis and synthesis. IEEE Transactions on Audio, Speech, and Language Processing 15, 3 (2007), 1075–1086.

[BDNN07] Busso C., Deng Z., Neumann U., Narayanan S.: Learning expressive human-like head motion sequences from speech. In Data-Driven 3D Facial Animation. Springer, 2007.

[Bec89] Becker W.: The neurobiology of saccadic eye movements: Metrics. Reviews of Oculomotor Research 3 (1989), 13–67.

[Bev13] Bevacqua E.: A survey of listener behavior and listener models for embodied conversational agents. In Coverbal Synchrony in Human-Machine Interaction (2013), pp. 243–268.

[BFJ*05] Bennewitz M., Faber F., Joho D., Schreiber M., Behnke S.: Integrating vision and speech for conversations with multiple persons. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (2005), pp. 2523–2528.

[BH10] Bohus D., Horvitz E.: Facilitating multiparty dialog with gaze, gesture, and speech. In International Conference on Multimodal Interfaces and the Workshop on Machine Learning for Multimodal Interaction (2010), ICMI-MLMI '10, pp. 1–5.

[BHAF09] Bard E. G., Hill R., Arai M., Foster M. E.: Referring and gaze alignment: Accessibility is alive and well in situated dialogue. In Proceedings of CogSci 2009 (2009), Cognitive Science Society, pp. 1246–1251.

[BHKS11] Bainbridge W. A., Hart J. W., Kim E. S., Scassellati B.: The benefits of interactions with physically present robots over video-displayed agents. International Journal of Social Robotics 3, 1 (2011), 41–52.

[BHZ12] Brennan S. E., Hanna J. E., Zelinsky G. J.: Eye gaze cues for coordination in collaborative tasks. In DUET 2012 Workshop: Dual Eye Tracking in CSCW, ACM Conference on Computer Supported Cooperative Work (2012).

[BKG11] Buhrmester M., Kwang T., Gosling S. D.: Amazon's Mechanical Turk: A new source of inexpensive, yet high-quality, data? Perspectives on Psychological Science 6, 1 (2011), 3–5.

[BMEL08] Bradley M. M., Miccoli L., Escrig M. A., Lang P. J.: The pupil as a measure of emotional arousal and autonomic activation. Psychophysiology 45, 4 (2008), 602–607.

[BPAW10] Bee N., Pollock C., André E., Walker M.: Bossy or wimpy: Expressing social dominance by combining gaze and linguistic behaviors. In Proceedings of the 10th International Conference on Intelligent Virtual Agents (2010), IVA '10, pp. 265–271.

[BPG*06] Breton G., Pelé D., Garcia C., Panaget F., Bretier P.: Modeling gaze behavior for a 3D ECA in a dialogue situation. In Gesture in Human-Computer Interaction and Simulation (2006), pp. 252–255.

[BWA*10] Bee N., Wagner J., André E., Vogt T., Charles F., Pizzi D., Cavazza M.: Gaze behavior during interaction with a virtual character in interactive storytelling. In AAMAS 2010 Workshop on Interacting with ECAs as Virtual Characters (2010).

[Car78] Cary M. S.: The role of gaze in the initiation of conversation. Social Psychology 41, 3 (1978), 269–271.

[Cas00] Cassell J.: Embodied Conversational Agents. MIT Press, 2000.

[CCD00] Colburn A., Cohen M. F., Drucker S.: The Role of Eye Gaze in Avatar Mediated Conversational Interfaces. Tech. rep., Microsoft Research, 2000.

[CGV09] Cafaro A., Gaito R., Vilhjálmsson H. H.: Animating idle gaze in public places. In Proceedings of the 9th International Conference on Intelligent Virtual Agents (2009), IVA '09, pp. 250–256. doi:10.1007/978-3-642-04380-2_28

[CB99] Chopra-Khullar S., Badler N. I.: Where to look? Automating attending behaviors of virtual human characters. In Proceedings of the Third Annual Conference on Autonomous Agents (1999), AGENTS '99, pp. 16–23.

[CKEM10] Cig C., Kasap Z., Egges A., Magnenat-Thalmann N.: Realistic emotional gaze and head behavior generation based on arousal and dominance factors. In Motion in Games (2010), pp. 278–289.

[CPB*94] Cassell J., Pelachaud C., Badler N., Steedman M., Achorn B., Becket T., Douville B., Prevost S., Stone M.: Animated conversation: Rule-based generation of facial expression, gesture and spoken intonation for multiple conversational agents. In Proceedings of the 21st Annual Conference on Computer Graphics and Interactive Techniques (1994), SIGGRAPH '94, pp. 413–420. doi:10.1145/192161.192272

[CT99] Cassell J., Thórisson K. R.: The power of a nod and a glance: Envelope vs. emotional feedback in animated conversational agents. Applied Artificial Intelligence 13, 4–5 (1999), 519–538.

[CT07] Cassell J., Tartaro A.: Intersubjectivity in human-agent interaction. Interaction Studies 8, 3 (2007), 391–410.

[CTP98] Cassell J., Torres O. E., Prevost S.: Turn taking vs. discourse structure: How best to model multimodal conversation. In Machine Conversations. Kluwer, 1998, pp. 143–154.

[CV99] Cassell J., Vilhjálmsson H.: Fully embodied conversational avatars: Making communicative behaviors autonomous. Autonomous Agents and Multi-Agent Systems 2, 1 (1999), 45–64.

[CVB01] Cassell J., Vilhjálmsson H. H., Bickmore T.: BEAT: The Behavior Expression Animation Toolkit. In Proceedings of the 28th Annual Conference on Computer Graphics and Interactive Techniques (2001), SIGGRAPH '01, ACM, pp. 477–486.

[CVB*12] Cafaro A., Vilhjálmsson H. H., Bickmore T., Heylen D., Jóhannsdóttir K. R., Valgarðsson G. S.: First impressions: Users' judgments of virtual agents' personality and interpersonal attitude in first encounters. In Intelligent Virtual Agents (2012), pp. 67–80. doi:10.1007/978-3-642-33197-8_7

[DLN05] Deng Z., Lewis J. P., Neumann U.: Automated eye motion using texture synthesis. IEEE Computer Graphics and Applications 25, 2 (2005), 24–30.

[DLN07] Deng Z., Lewis J. P., Neumann U.: Realistic eye motion synthesis by texture synthesis. In Data-Driven 3D Facial Animation. Springer, 2007.

[Dou01] Doughty M. J.: Consideration of three types of spontaneous eyeblink activity in normal humans: During reading and video display terminal use, in primary gaze, and while in conversation. Optometry and Vision Science 78, 10 (2001), 712–725.

[Dun74] Duncan S.: On the structure of speaker-auditor interaction during speaking turns. Language in Society 3, 2 (1974), 161–180.

[EF03] Ekman P., Friesen W. V.: Unmasking the Face: A Guide to Recognizing Emotions from Facial Clues. Malor Books, 2003.

[EMP*94] Evinger C., Manning K. A., Pellegrini J. J., Basso M. A., Powers A. S., Sibony P. A.: Not looking while leaping: The linkage of blinking and saccadic gaze shifts. Experimental Brain Research 100 (1994), 337–344.

[EMS91] Evinger C., Manning K. A., Sibony P. A.: Eyelid movements: Mechanisms and normal data. Investigative Ophthalmology & Visual Science 32, 2 (1991), 387–400.

[FBT07] Frischen A., Bayliss A. P., Tipper S. P.: Gaze cueing of attention: Visual attention, social cognition, and individual differences. Psychological Bulletin 133, 4 (2007), 694–724.

[FGBB09] François G., Gautron P., Breton G., Bouatouch K.: Image-based modeling of the human eye. IEEE Transactions on Visualization and Computer Graphics 15, 5 (2009), 815–827.

[FN11] Fukuhara Y., Nakano Y.: Gaze and conversation dominance in multiparty interaction. In 2nd Workshop on Eye Gaze in Intelligent Human Machine Interaction (2011).

[FOM*02] Fukayama A., Ohno T., Mukawa N., Sawaki M., Hagita N.: Messages embedded in gaze of interface agents – impression management with agent's gaze. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (2002), CHI '02, pp. 41–48. doi:10.1145/503376.503385

[Ful92] Fuller J. H.: Head movement propensity. Experimental Brain Research 92, 1 (1992), 152–164.

[GB06] Gu E., Badler N. I.: Visual attention and eye gaze during multiparty conversations with distractions. In Intelligent Virtual Agents (2006), pp. 193–204.

[GD02] Gillies M., Dodgson N.: Eye movements and attention for behavioural animation. Journal of Visualization and Computer Animation 13, 5 (2002), 287–300.

[GK05] Grimm M., Kroschel K.: Evaluation of natural emotions using self assessment manikins. In IEEE Workshop on Automatic Speech Recognition and Understanding (2005), pp. 381–385.

[GLBB07] Gu E., Lee S. P., Badler J. B., Badler N. I.: Eye movements, saccades, and multiparty conversations. In Data-Driven 3D Facial Animation. Springer, 2007, pp. 79–97.

[GRS03] Gosling S. D., Rentfrow P. J., Swann W. B.: A very brief measure of the Big-Five personality domains. Journal of Research in Personality 37, 6 (2003), 504–528.

[GSC91] Guitton D., Simard R., Codère F.: Upper eyelid movements measured with a search coil during blinks and vertical saccades. Investigative Ophthalmology & Visual Science 32, 13 (1991), 3298–3305.

[GSV*03] Garau M., Slater M., Vinayagamoorthy V., Brogni A., Steed A., Sasse M. A.: The impact of avatar realism and eye gaze control on perceived quality of communication in a shared immersive virtual environment. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (2003), CHI '03, ACM, pp. 529–536.

[GT09] Grillon H., Thalmann D.: Simulating gaze attention behaviors for crowds. Computer Animation and Virtual Worlds 20, 2 (2009), 111–119.

[GV87] Guitton D., Volle M.: Gaze control in humans: Eye-head coordination during orienting movements to targets within and beyond the oculomotor range. Journal of Neurophysiology 58 (1987), 427–459.

[GWG*07] Gratch J., Wang N., Gerten J., Fast E., Duffy R.: Creating rapport with virtual agents. In Proceedings of the 7th International Conference on Intelligent Virtual Agents (2007), pp. 125–138.

[HB07] Hanna J. E., Brennan S. E.: Speakers' eye gaze disambiguates referring expressions early during face-to-face conversation. Journal of Memory and Language 57, 4 (2007), 596–615.

[Hey06] Heylen D.: Head gestures, gaze and the principles of conversational structure. International Journal of Humanoid Robotics 3, 3 (2006), 241–267.

[HJO*10] Hodgins J., Jörg S., O'Sullivan C., Park S. I., Mahler M.: The saliency of anomalies in animated human characters. ACM Transactions on Applied Perception 7, 4 (2010), 22:1–22:14.

[HM13] Huang C.-M., Mutlu B.: Modeling and evaluating narrative gestures for humanlike robots. In Proceedings of Robotics: Science and Systems (2013).

[HNP07] Heylen D., Nijholt A., Poel M.: Generating nonverbal signals for a sensitive artificial listener. In Verbal and Nonverbal Communication Behaviours (2007), pp. 264–274.

[HO12] Hjalmarsson A., Oertel C.: Gaze direction as a back-channel inviting cue in dialogue. In IVA 2012 Workshop on Realtime Conversational Virtual Agents (2012).

[How09] Howell D. C.: Statistical Methods for Psychology. Cengage Learning, 2009.

[IDP03] Itti L., Dhavale N., Pighin F.: Realistic avatar eye and head animation using a neurobiological model of visual attention. In Proceedings of SPIE (2003), vol. 5200, pp. 64–78.

[IDP06] Itti L., Dhavale N., Pighin F.: Photorealistic attention-based gaze animation. In Proceedings of the IEEE International Conference on Multimedia and Expo (2006).

[IMFN06] Ishii R., Miyajima T., Fujita K., Nakano Y.: Avatar's gaze control to facilitate conversational turn-taking in virtual-space multi-user voice chat system. In Intelligent Virtual Agents (2006), Lecture Notes in Computer Science 4133, p. 458.

[IONN13] Ishii R., Ooko R., Nakano Y., Nishida T.: Effectiveness of gaze-based engagement estimation in conversational agents. In Eye Gaze in Intelligent User Interfaces. Springer, 2013, pp. 85–110.

[Itt00] Itti L.: Models of Bottom-Up and Top-Down Visual Attention. PhD thesis, California Institute of Technology, 2000.

[JHM*07] Jan D., Herrera D., Martinovski B., Novick D., Traum D.: A computational model of culture-specific conversational behavior. In Intelligent Virtual Agents (2007), Lecture Notes in Computer Science 4722, pp. 45–56.

[JTC*07] Johns M., Tucker A., Chapman R., Crowley K., Michael N.: Monitoring eye and eyelid movements by infrared reflectance oculography to measure drowsiness in drivers. Somnologie - Schlafforschung und Schlafmedizin 11, 4 (2007), 234–242. doi:10.1007/s11818-007-0311-y

[KB04] Kidd C. D., Breazeal C.: Effect of a robot on user perceptions. In Proceedings of IROS 2004 (2004), pp. 3559–3564.

[KBS40] Kendall M. G., Babington Smith B.: On the method of paired comparisons. Biometrika 31, 3–4 (1940), 324–345.

[Ken67] Kendon A.: Some functions of gaze-direction in social interaction. Acta Psychologica 26, 1 (1967), 22–63.

[Ken90] Kendon A.: Conducting Interaction: Patterns of Behavior in Focused Encounters. Studies in Interactional Sociolinguistics. Cambridge University Press, 1990.

[KHK13] Komogortsev O. V., Holland C., Karpov A.: 2D linear oculomotor plant mathematical model: Verification and biometric applications. ACM Transactions on Applied Perception 10, 4 (2013), 27:1–27:18.

[KK13] Kulms P., Kopp S.: Using virtual agents to guide attention in multi-task scenarios. In Intelligent Virtual Agents (2013), pp. 295–302.

[KKM*06] Kopp S., Krenn B., Marsella S., Marshall A. N., Pelachaud C., Pirker H., Thórisson K. R., Vilhjálmsson H.: Towards a common framework for multimodal generation: The Behavior Markup Language. In Proceedings of the 6th International Conference on Intelligent Virtual Agents (2006), IVA '06, pp. 205–217. doi:10.1007/11821830_17

[Kle86] Kleinke C. L.: Gaze and eye contact: A research review. Psychological Bulletin 100, 1 (1986), 78–100.

[KOS11] Kokkinara E., Oyekoya O., Steed A.: Modelling selective visual attention for autonomous virtual characters. Computer Animation and Virtual Worlds 22, 4 (2011), 361–369.

[KPP*11] Krenn B., Pelachaud C., Pirker H., Peters C.: Embodied conversational characters: Representation formats for multimodal communicative behaviours. In Emotion-Oriented Systems, Cognitive Technologies. Springer, 2011, pp. 389–415.

[Kep91] Keppel G.: Design and Analysis: A Researcher's Handbook, 3rd ed. Prentice Hall, 1991.

[KYH*98] Kikuchi H., Yokoyama M., Hoashi K., Hidaki Y., Kobayashi T., Shirai K.: Controlling gaze of humanoid in communication with human. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (1998).

[LB06] Lam M. W. Y., Baranoski G. V. G.: A predictive light transport model for the human iris. Computer Graphics Forum 25, 3 (2006), 359–368.

[LBB02] Lee S. P., Badler J. B., Badler N. I.: Eyes alive. ACM Transactions on Graphics 21, 3 (2002), 637–644.

[LBS*03] Lefohn A., Budge B., Shirley P., Caruso R., Reinhard E.: An ocularist's approach to human iris synthesis. IEEE Computer Graphics and Applications 23, 6 (2003), 70–75. doi:10.1109/MCG.2003.1242384

[LM10a] Lance B., Marsella S.: The expressive gaze model: Using gaze to express emotion. IEEE Computer Graphics and Applications 30, 4 (2010), 62–73.

[LM10b] Lance B., Marsella S.: Glances, glares, and glowering: How should a virtual human express emotion through gaze? Autonomous Agents and Multi-Agent Systems 20, 1 (2010), 50–69.

[LMD12] Le B. H., Ma X., Deng Z.: Live speech driven head-and-eye motion generators. IEEE Transactions on Visualization and Computer Graphics 18, 11 (2012), 1902–1914.

[LvW12] Lohse M., van Welbergen H.: Designing appropriate feedback for virtual agents and robots. In Position paper at the RO-MAN 2012 Workshop "Robot Feedback in Human-Robot Interaction: How to Make a Robot Readable for a Human Interaction Partner" (2012).

[LZ99] Leigh R. J., Zee D. S.: The Neurology of Eye Movements, 3rd ed. No. 55 in Contemporary Neurology Series. Oxford University Press, 1999.

[MB10] McDonnell R., Breidt M.: Face reality: Investigating the uncanny valley for virtual faces. In ACM SIGGRAPH ASIA 2010 Sketches (2010), SA '10, ACM, pp. 41:1–41:2.

[MB12] Mariooryad S., Busso C.: Generating human-like behaviors using joint, speech-driven models for conversational agents. IEEE Transactions on Audio, Speech, and Language Processing 20, 8 (2012), 2329–2340.

[MBB12] McDonnell R., Breidt M., Bülthoff H. H.: Render me real? Investigating the effect of render style on the perception of animated virtual humans. ACM Transactions on Graphics 31, 4 (2012), 91:1–91:11.

[MD09] Ma X., Deng Z.: Natural eye motion synthesis by modeling gaze-head coupling. In Proceedings of the 2009 IEEE Virtual Reality Conference (2009), VR '09, pp. 143–150.

[MDR*11] Martin J.-C., Devillers L., Raouzaiou A., Caridakis G., Ruttkay Z., Pelachaud C., Mancini M., Niewiadomski R., et al.: Coordinating the generation of signs in multiple modalities in an affective agent. In Emotion-Oriented Systems, Cognitive Technologies. Springer, 2011, pp. 349–367.

[MEB12] Al Moubayed S., Edlund J., Beskow J.: Taming Mona Lisa: Communicating gaze faithfully in 2D and 3D facial projections. ACM Transactions on Interactive Intelligent Systems 1, 2 (2012).

[Meh80] Mehrabian A.: Basic Dimensions for a General Psychological Theory. Oelgeschlager, Gunn & Hain, 1980.

[MGR04] Marsella S., Gratch J., Rickel J.: Expressive behaviors for virtual worlds. In Life-Like Characters, Cognitive Technologies. Springer, 2004.

[MH07] Masuko S., Hoshino J.: Head-eye animation corresponding to a conversation for CG characters. Computer Graphics Forum 26, 3 (2007), 303–312.

[MHKS07] Mitake H., Hasegawa S., Koike Y., Sato M.: Reactive virtual human with bottom-up and top-down visual attention for gaze generation in realtime interactions. In Proceedings of the IEEE Virtual Reality Conference (2007), pp. 211–214.

[MLD*08] McDonnell R., Larkin M., Dobbyn S., Collins S., O'Sullivan C.: Clone attack! Perception of crowd variety. ACM Transactions on Graphics 27, 3 (2008), 26:1–26:8. doi:10.1145/1360612.1360625

[MLH*09] McDonnell R., Larkin M., Hernández B., Rudomin I., O'Sullivan C.: Eye-catching crowds: Saliency based selective variation. ACM Transactions on Graphics 28, 3 (2009), 55:1–55:10.

[MM11] Mumm J., Mutlu B.: Human-robot proxemics: Physical and psychological distancing in human-robot interaction. In Proceedings of the 6th International Conference on Human-Robot Interaction (2011), HRI '11, pp. 331–338.

[MRS*07] Murray N., Roberts D., Steed A., Sharkey P., Dickerson P., Rae J.: An assessment of eye-gaze potential within immersive virtual environments. ACM Transactions on Multimedia Computing, Communications, and Applications 3, 4 (2007), Article 8.

[MSH*06] Mojzisch A., Schilbach L., Helmert J. R., Pannasch S., Velichkovsky B. M., Vogeley K.: The effects of self-involvement on attention, arousal, and facial expression during social interaction with virtual others: A psychophysiological study. Social Neuroscience 1, 3–4 (2006), 184–195.

[MSK*09] Mutlu B., Shiwa T., Kanda T., Ishiguro H., Hagita N.: Footing in human-robot conversations: How robots might shape participant roles using gaze cues. In Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction (2009), HRI '09, pp. 61–68.

[MSSB10] Martinez S., Sloan R. J. S., Szymkowiak A., Scott-Brown K. C.: Using virtual agents to cue observer attention. In CONTENT 2010: The Second International Conference on Creative Content Technologies (2010).

[MVC*12] McKeown G., Valstar M., Cowie R., Pantic M., Schröder M.: The SEMAINE database: Annotated multimodal records of emotionally colored conversations between a person and a limited agent. IEEE Transactions on Affective Computing 3, 1 (2012), 5–17.

[MXL*13] Marsella S., Xu Y., Lhommet M., Feng A., Scherer S., Shapiro A.: Virtual character performance from speech. In Proceedings of the 12th ACM SIGGRAPH/Eurographics Symposium on Computer Animation (2013), SCA '13, pp. 25–35. doi:10.1145/2485895.2485900

[NBF*13] Normoyle A., Badler J. B., Fan T., Badler N. I., Cassol V. J., Musse S. R.: Evaluating perceived trust from procedurally animated gaze. In Proceedings of Motion on Games (2013), MIG '13, pp. 141–148. doi:10.1145/2522628.2522630

[NHP] Niewiadomski R., Hyniewska S., Pelachaud C.: Computational models of expressive behaviors for a virtual agent. Oxford Series on Cognitive Models and Architectures.

[NI10] Nakano Y. I., Ishii R.: Estimating user's engagement from eye-gaze behaviors in human-agent conversations. In Proceedings of the 15th International Conference on Intelligent User Interfaces (2010), IUI '10, pp. 139–148.

[NKM*13] Nakano T., Kato M., Morito Y., Itoi S., Kitazawa S.: Blink-related momentary activation of the default mode network while viewing videos. Proceedings of the National Academy of Sciences 110, 2 (2013), 702–706. doi:10.1073/pnas.1214804110

[Nor63] Norman W. T.: Toward an adequate taxonomy of personality attributes: Replicated factor structure in peer nomination personality ratings. Journal of Abnormal and Social Psychology 66, 6 (1963), 574–583.

[ODK*12] Obaid M., Damian I., Kistler F., Endrass B., Wagner J., André E.: Cultural behaviors of virtual agents in an augmented reality environment. In Intelligent Virtual Agents (2012), Lecture Notes in Computer Science 7502, Springer.

[OO80] Otteson J. P., Otteson C. R.: Effect of teacher's gaze on children's story recall. Perceptual and Motor Skills 50 (1980), 35–42.

[Opp86] Oppenheimer P. E.: Real time design and animation of fractal plants and trees. ACM SIGGRAPH Computer Graphics 20, 4 (1986), 55–64.

[OSP11] Oyekoya O., Steed A., Pan X.: Short paper: Exploring the object relevance of a gaze animation model. In Proceedings of the 17th Eurographics Conference on Virtual Environments & Third Joint Virtual Reality (2011), EGVE-JVRC '11, Eurographics Association, pp. 111–114.

[OSS09] Oyekoya O., Steptoe W., Steed A.: A saliency-based method of simulating visual attention in virtual scenes. In Proceedings of the 16th ACM Symposium on Virtual Reality Software and Technology (2009), VRST '09, pp. 199–206.

[OST57] Osgood C. E., Suci G. J., Tannenbaum P. H.: The Measurement of Meaning. University of Illinois Press, 1957.

[PAK10] Peters C., Asteriadis S., Karpouzis K.: Investigating shared attention with a virtual agent using a gaze-based interface. Journal on Multimodal User Interfaces 3, 1–2 (2010), 119–130.

[PB03] Pelachaud C., Bilvi M.: Modelling gaze behavior for conversational agents. In Intelligent Virtual Agents (2003), Lecture Notes in Computer Science 2792, pp. 93–100.

[PBER07] Picot A., Bailly G., Elisei F., Raidt S.: Scrutinizing natural scenes: Controlling the gaze of an embodied conversational agent. In Intelligent Virtual Agents (2007), pp. 272–282.

[PCI10] Paolacci G., Chandler J., Ipeirotis P. G.: Running experiments on Amazon Mechanical Turk. Judgment and Decision Making 5, 5 (2010), 411–419.

[PCR*11] Peters C., Castellano G., Rehm M., André E., Raouzaiou A., et al.: Fundamentals of agent perception and attention modelling. In Emotion-Oriented Systems, Cognitive Technologies. Springer, 2011.

[PHN*09] Poel M., Heylen D., Nijholt A., Meulemans M., van Breemen A.: Gaze behaviour, believability, likability and the iCat. AI & Society 24, 1 (2009), 61–73.

[PKFT07] Powers A., Kiesler S., Fussell S., Torrey C.: Comparing a computer agent with a humanoid robot. In Proceedings of HRI 2007 (2007), pp. 145–152.

[PMG13] Pejsa T., Mutlu B., Gleicher M.: Stylized and performative gaze for character animation. Computer Graphics Forum 32, 2 (2013), 143–152.

[POB09] Pamplona V. F., Oliveira M. M., Baranoski G. V. G.: Photorealistic models for pupil light reflex and iridal pattern deformation. ACM Transactions on Graphics 28, 4 (2009), 106:1–106:12.

[PO03] Peters C., O'Sullivan C.: Bottom-up visual attention for virtual human animation. In Proceedings of the 16th International Conference on Computer Animation and Social Agents (2003), CASA '03, pp. 111–117.

[PPB*05] Peters C., Pelachaud C., Bevacqua E., Mancini M., Poggi I.: A model of attention and interest using gaze behavior. In Intelligent Virtual Agents (2005), pp. 229–240. doi:10.1007/11550617_20

[PPD00] Poggi I., Pelachaud C., De Rosis F.: Eye communication in a conversational 3D synthetic agent. AI Communications 13, 3 (2000), 169–181.

[PQ10] Peters C., Qureshi A.: A head movement propensity model for animating gaze shifts and blinks of virtual characters. Computers & Graphics 34, 6 (2010), 677–687.

[QBM07] Queiroz R. B., Barros L. M., Musse S. R.: Automatic generation of expressive gaze in virtual animated characters: From artists craft to a behavioral animation model. In Intelligent Virtual Agents (2007), pp. 401–402.

[QBM08] Queiroz R. B., Barros L. M., Musse S. R.: Towards consistency in interactive storytelling. Computers in Entertainment 6, 3 (2008). doi:10.1145/1394021.1394036

[SB12] Schulman D., Bickmore T.: Changes in verbal and nonverbal conversational behavior in long-term interaction. In Proceedings of the 14th ACM International Conference on Multimodal Interaction (2012), ICMI '12, pp. 11–18.

[SBM*94] Sagar M. A., Bullivant D., Mallinson G. D., Hunter P. J.: A virtual environment and model of the eye for surgical simulation. In Proceedings of the 21st Annual Conference on Computer Graphics and Interactive Techniques (1994), SIGGRAPH '94, ACM, pp. 205–212.

[SC09] Staudte M., Crocker M.: The effect of robot gaze on processing robot utterances. In Proceedings of the 31st Annual Conference of the Cognitive Science Society (2009).

[Sha11] Shapiro A.: Building a character animation system. In Proceedings of the 4th International Conference on Motion in Games (2011), pp. 98–109.

[SNJ*07] Skotte J. H., Nøjgaard J. K., Jørgensen L. V., Christensen K. B., Sjøgaard G.: Eye blink frequency during different computer tasks quantified by electrooculography. European Journal of Applied Physiology 99, 2 (2007), 113–119. doi:10.1007/s00421-006-0322-6

[SOS10] Steptoe W., Oyekoya O., Steed A.: Eyelid kinematics for virtual characters. Computer Animation and Virtual Worlds 21, 3–4 (2010), 161–171.

[SS08] Steptoe W., Steed A.: High-fidelity avatar eye-representation. In Proceedings of the IEEE Virtual Reality Conference (2008), VR '08, pp. 111–114.

[Sta99] Stahl J. S.: Amplitude of human head movements associated with horizontal saccades. Experimental Brain Research 126, 1 (1999), 41–54.

[SWG84] Stern J. A., Walrath L. C., Goldstein R.: The endogenous eyeblink. Psychophysiology 21, 1 (1984), 22–33.

[TCMH11] Trutoiu L. C., Carter E. J., Matthews I., Hodgins J. K.: Modeling and animating eye blinks. ACM Transactions on Applied Perception 8, 3 (2011), 17:1–17:17.

[TS04] Tombs S., Silverman I.: Pupillometry: A sexual selection approach. Evolution and Human Behavior 25, 4 (2004), 221–228.

[VBP11] Vala M., Blanco G., Paiva A.: Providing gender to embodied conversational agents. In Intelligent Virtual Agents (2011), Lecture Notes in Computer Science 6895, pp. 148–154.

[VBR*03] Vanderwerf F., Brassinga P., Reits D., Aramideh M., Ongerboer de Visser B.: Eyelid movements: Behavioral studies of blinking in humans under different stimulus conditions. Journal of Neurophysiology 89, 5 (2003), 2784–2796.

[VC98] Vilhjálmsson H. H., Cassell J.: BodyChat: Autonomous communicative behaviors in avatars. In Proceedings of the Second International Conference on Autonomous Agents (1998), AGENTS '98, pp. 269–276. doi:10.1145/280765.280843

[VCC*07] Vilhjálmsson H., Cantelmo N., Cassell J., Chafai N. E., Kipp M., Kopp S., Mancini M., Marsella S., et al.: The Behavior Markup Language: Recent developments and challenges. In Proceedings of the 7th International Conference on Intelligent Virtual Agents (2007), IVA '07, pp. 99–111.

[VGSS04] Vinayagamoorthy V., Garau M., Steed A., Slater M.: An eye gaze model for dyadic interaction in an immersive virtual environment: Practice and experience. Computer Graphics Forum 23, 1 (2004), 1–11.

[Vil04] Vilhjálmsson H. H.: Animating conversation in online games. In Entertainment Computing – ICEC 2004, Lecture Notes in Computer Science. Springer, 2004.

[WLO10] Weissenfeld A., Liu K., Ostermann J.: Video-realistic image-based eye animation via statistically driven state machines. The Visual Computer 26, 9 (2010), 1201–1216.

[WSG05] Wecker L., Samavati F., Gavrilova M.: Iris synthesis: A reverse subdivision application. In Proceedings of the 3rd International Conference on Computer Graphics and Interactive Techniques in Australasia and South East Asia (2005), GRAPHITE '05, pp. 121–125.

[YLNP12] Yeo S. H., Lesmana M., Neog D. R., Pai D. K.: Eyecatch: Simulating visuomotor coordination for object interception. ACM Transactions on Graphics 31, 4 (2012), 42:1–42:10.

[YSI*06] Yoshikawa Y., Shinozawa K., Ishiguro H., Hagita N., Miyamoto T.: Responsive robot gaze to interaction partner. In Robotics: Science and Systems II (2006). doi:10.15607/RSS.2006.II.037

[ZFP11] Zoric G., Forchheimer R., Pandzic I.: On creating multimodal virtual humans – real time speech driven facial gesturing. Multimedia Tools and Applications 54, 1 (2011), 165–179.

[ZHRM13] Zibrek K., Hoyet L., Ruhland K., McDonnell R.: Evaluating the effect of emotion on gender recognition in virtual humans. In Proceedings of the ACM Symposium on Applied Perception (2013), SAP '13, ACM, pp. 45–49.

[ZS06] Zuo J., Schmid N. A.: A model based, anatomy based method for synthesizing iris images. In Proceedings of the 2006 International Conference on Advances in Biometrics (2006), ICB '06, pp. 428–435.