The Human Face of Mobile

Abstract: As the landscape around Big Data continues to evolve exponentially, the "big" facet of Big Data is no longer the number-one priority of researchers and IT professionals. The race has recently become about how to sift through torrents of data to find the hidden diamond and engineer a better, smarter, and healthier world. The ease with which our mobile devices capture daily data about ourselves makes them an exceptionally suitable means for ultimately improving the quality of our lives and gaining valuable insights into our affective, mental, and physical state. This paper takes a first exploratory step in this direction by using the mobile device to process and analyze the "digital exhaust" it collects, automatically recognizing our emotional states and responding to them in the most effective and "human" way possible. To achieve this, we treat the technical, psychosomatic, and cognitive aspects of emotion observation and prediction, and repackage these elements into a mobile multimodal emotion recognition system that can be used on any mobile device.
Cited literature: 37 references

https://hal.inria.fr/hal-01397139
Contributor: Hal Ifip
Submitted on: Tuesday, November 15, 2016 - 3:03:28 PM
Last modification on: Friday, August 9, 2019 - 3:18:07 PM
Long-term archiving on: Thursday, March 16, 2017 - 5:01:05 PM

File

978-3-642-55032-4_1_Chapter.pd...
Files produced by the author(s)

Licence
Distributed under a Creative Commons Attribution 4.0 International License

Citation

Hajar Mousannif, Ismail Khalil. The Human Face of Mobile. 2nd Information and Communication Technology - EurAsia Conference (ICT-EurAsia), Apr 2014, Bali, Indonesia. pp.1-20, ⟨10.1007/978-3-642-55032-4_1⟩. ⟨hal-01397139⟩
