
The Human Face of Mobile

Abstract: As the landscape around Big Data continues to evolve at an exponential pace, the "big" facet of Big Data is no longer the number-one priority of researchers and IT professionals. The race has recently become more about how to sift through torrents of data to find the hidden diamond and engineer a better, smarter, and healthier world. The ease with which our mobile devices capture daily data about us makes them an exceptionally suitable means for improving the quality of our lives and for gaining valuable insights into our affective, mental, and physical state. This paper takes a first exploratory step in this direction by using the mobile device to process and analyze the "digital exhaust" it collects, automatically recognizing our emotional states and responding to them in the most effective and "human" way possible. To achieve this, we treat the technical, psychosomatic, and cognitive aspects of emotion observation and prediction, and repackage these elements into a mobile multimodal emotion recognition system that can run on any mobile device.

Submitted on: Tuesday, November 15, 2016 - 3:03:28 PM
Last modification on: Friday, May 28, 2021 - 5:08:01 PM
Long-term archiving on: Thursday, March 16, 2017 - 5:01:05 PM


Files produced by the author(s)


Distributed under a Creative Commons Attribution 4.0 International License



Hajar Mousannif, Ismail Khalil. The Human Face of Mobile. 2nd Information and Communication Technology - EurAsia Conference (ICT-EurAsia), Apr 2014, Bali, Indonesia. pp.1-20, ⟨10.1007/978-3-642-55032-4_1⟩. ⟨hal-01397139⟩


