
Semantic Modelling in Support of Adaptive Multimodal Interface Design

Abstract: The design of multimodal interfaces requires intelligent data interpretation in order to guarantee seamless adaptation to the user's needs and context. HMI (human-machine interaction) design must accommodate varying forms of interaction, depending on what is most appropriate for a particular user at a particular time. Design patterns are a powerful means of documenting such reusable design know-how. The semantic modelling framework presented in this paper captures the available domain knowledge in the field of multimodal interface design and supports adaptive HMIs. A collection of multimodal design patterns, drawn from a diversity of real-world applications, is organized into a meaningful repository. This enables a uniform and unambiguous description of the patterns, easing their identification, comprehension and application.
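The paper's framework is semantic (ontology-based), so the sketch below is only an illustration of the underlying idea: design patterns given a uniform, machine-readable description and collected in a repository that can be queried by interaction context. All pattern names, attributes and contexts here are hypothetical, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DesignPattern:
    """A uniform description of a multimodal design pattern (hypothetical schema)."""
    name: str
    modalities: frozenset  # interaction modalities the pattern combines
    contexts: frozenset    # user/usage contexts in which the pattern applies
    description: str

# A tiny repository of patterns, keyed by name (illustrative entries only)
repository = {
    p.name: p for p in [
        DesignPattern("voice-fallback",
                      frozenset({"speech", "touch"}),
                      frozenset({"hands-busy"}),
                      "Fall back to touch input when speech recognition fails."),
        DesignPattern("glanceable-summary",
                      frozenset({"visual"}),
                      frozenset({"driving", "hands-busy"}),
                      "Condense output to a short visual summary."),
    ]
}

def patterns_for(context: str) -> list:
    """Retrieve the names of all patterns applicable in a given context."""
    return sorted(p.name for p in repository.values() if context in p.contexts)

print(patterns_for("hands-busy"))
```

Because every pattern carries the same structured attributes, a lookup such as `patterns_for("hands-busy")` returns all applicable patterns uniformly; a real semantic repository would express the same relations as ontology classes and properties and answer such questions with reasoning or SPARQL queries.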
Document type: Conference paper

Submitted on: Wednesday, April 19, 2017
Last modified on: Wednesday, November 3, 2021


Files produced by the author(s)


Distributed under a Creative Commons Attribution 4.0 International License



Elena Tsiporkova, Anna Hristoskova, Tom Tourwé, Tom Stevens. Semantic Modelling in Support of Adaptive Multimodal Interface Design. 14th International Conference on Human-Computer Interaction (INTERACT), Sep 2013, Cape Town, South Africa. pp.627-634, ⟨10.1007/978-3-642-40498-6_54⟩. ⟨hal-01510520⟩


