Semantic Modelling in Support of Adaptive Multimodal Interface Design

Abstract: The design of multimodal interfaces requires intelligent data interpretation in order to guarantee seamless adaptation to the user's needs and context. HMI (human-machine interaction) design accommodates varying forms of interaction patterns, depending on what is most appropriate for a particular user at a particular time. Such design patterns are a powerful means of documenting reusable design know-how. The semantic modelling framework presented in this paper captures the available domain knowledge in the field of multimodal interface design and supports adaptive HMIs. A collection of multimodal design patterns is constructed from a diversity of real-world applications and organized into a meaningful repository. This enables a uniform and unambiguous description of the patterns, easing their identification, comprehension and application.
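To make the idea of a uniform, machine-readable pattern description more concrete, the following is a minimal illustrative sketch in Python using rdflib. The namespace, property names and the example pattern are invented for illustration only and are not taken from the paper's ontology or tooling.

```python
# A minimal, hypothetical sketch (not from the paper) of how one multimodal
# design pattern might be captured as semantic triples so that all patterns
# in a repository share a uniform, machine-readable description.
# Requires: pip install rdflib
from rdflib import Graph, Literal, Namespace, RDF, RDFS

MMI = Namespace("http://example.org/mmi#")  # placeholder namespace, not from the paper

g = Graph()
g.bind("mmi", MMI)

pattern = MMI["VoiceFallbackPattern"]  # hypothetical pattern name
g.add((pattern, RDF.type, MMI.DesignPattern))
g.add((pattern, RDFS.label, Literal("Voice fallback for hands-busy contexts")))
g.add((pattern, MMI.addressesContext, MMI.HandsBusy))
g.add((pattern, MMI.inputModality, MMI.Speech))
g.add((pattern, MMI.outputModality, MMI.Audio))
g.add((pattern, MMI.problem, Literal("Touch input is unavailable while the user's hands are occupied.")))
g.add((pattern, MMI.solution, Literal("Switch the interface to speech input and audio feedback.")))

# Serialising to Turtle gives every pattern the same explicit structure,
# which is what eases identification and comparison across a repository.
print(g.serialize(format="turtle"))
```

Because every pattern is expressed with the same vocabulary of classes and properties, patterns can be queried, compared and matched to a user's current context in a uniform way, which is the kind of benefit the abstract attributes to the semantic repository.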
Document type: Conference papers

https://hal.inria.fr/hal-01510520
Contributor: Hal Ifip
Submitted on: Wednesday, April 19, 2017 - 3:34:53 PM
Last modification on: Thursday, April 20, 2017 - 1:07:23 AM

File: 978-3-642-40498-6_54_Chapter.p... (produced by the author(s))

Licence: Distributed under a Creative Commons Attribution 4.0 International License

Citation

Elena Tsiporkova, Anna Hristoskova, Tom Tourwé, Tom Stevens. Semantic Modelling in Support of Adaptive Multimodal Interface Design. 14th International Conference on Human-Computer Interaction (INTERACT), Sep 2013, Cape Town, South Africa. pp.627-634, ⟨10.1007/978-3-642-40498-6_54⟩. ⟨hal-01510520⟩
