Abstract: The design of multimodal interfaces requires intelligent data interpretation to ensure seamless adaptation to the user’s needs and context. HMI (human-machine interaction) design accommodates varying forms of interaction patterns, depending on which is most appropriate for a particular user at a particular time. Such design patterns are a powerful means of documenting reusable design know-how. The semantic modelling framework presented in this paper captures the available domain knowledge in the field of multimodal interface design and supports adaptive HMIs. A collection of multimodal design patterns is derived from a diversity of real-world applications and organized into a meaningful repository. This enables a uniform and unambiguous description, easing their identification, comprehension, and application.
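To illustrate the idea of a uniform pattern description and a queryable repository, the sketch below models each design pattern with a fixed set of attributes and supports lookup by modality. This is a minimal illustration under assumed attribute names (`name`, `problem`, `solution`, `modalities`); the paper's actual framework is a semantic (ontology-based) model, not this simple attribute scheme.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class DesignPattern:
    """Uniform description of a multimodal design pattern (fields are illustrative)."""
    name: str
    problem: str          # the recurring design problem the pattern addresses
    solution: str         # the documented, reusable solution
    modalities: frozenset  # e.g. {"speech", "touch", "visual"}


class PatternRepository:
    """Minimal repository easing identification of applicable patterns."""

    def __init__(self):
        self._patterns = []

    def add(self, pattern: DesignPattern):
        self._patterns.append(pattern)

    def find_by_modality(self, modality: str):
        # Return every pattern whose description involves the given modality.
        return [p for p in self._patterns if modality in p.modalities]


repo = PatternRepository()
repo.add(DesignPattern(
    name="Voice Confirmation",
    problem="input errors in hands-busy contexts",
    solution="confirm critical commands via a spoken prompt",
    modalities=frozenset({"speech"}),
))
repo.add(DesignPattern(
    name="Redundant Output",
    problem="divided user attention",
    solution="present alerts on both audio and visual channels",
    modalities=frozenset({"speech", "visual"}),
))

print([p.name for p in repo.find_by_modality("speech")])
# → ['Voice Confirmation', 'Redundant Output']
```

Because every pattern shares the same description schema, queries such as "all patterns involving speech" become trivial; a semantic model additionally allows reasoning over relations between patterns, which plain attributes cannot express.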
https://hal.inria.fr/hal-01510520 — submitted on 19 April 2017; last modified 3 November 2021.
Elena Tsiporkova, Anna Hristoskova, Tom Tourwé, Tom Stevens. Semantic Modelling in Support of Adaptive Multimodal Interface Design. 14th International Conference on Human-Computer Interaction (INTERACT), Sep 2013, Cape Town, South Africa. pp.627-634, ⟨10.1007/978-3-642-40498-6_54⟩. ⟨hal-01510520⟩