Conference paper, 2016

Next-Point Prediction Metrics for Perceived Spatial Errors

Mathieu Nancel (1, 2), Daniel Vogel (2), Bruno de Araujo (3, 4), Ricardo Jota (4), Géry Casiez (5)
Abstract

Touch screens have a delay between user input and the corresponding visual interface feedback, called input "latency" (or "lag"). Visual latency is more noticeable during continuous input actions like dragging, so methods to display feedback based on the most likely path for the next few input points have been described in research papers and patents. Designing these "next-point prediction" methods is challenging, and there have been no standard metrics to compare different approaches. We introduce metrics to quantify the probability of 7 spatial error "side-effects" caused by next-point prediction methods. Types of side-effects are derived using a thematic analysis of comments gathered in a 12-participant study covering drawing, dragging, and panning tasks using 5 state-of-the-art next-point predictors. Using experiment logs of actual and predicted input points, we develop quantitative metrics that correlate positively with the frequency of perceived side-effects. These metrics enable practitioners to compare next-point predictors using only input logs.
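To make the idea concrete, here is a minimal sketch of next-point prediction and a simple spatial-error measure over input logs. The constant-velocity predictor, the `horizon` parameter, and the mean-distance metric are illustrative assumptions, not the paper's actual predictors or its seven side-effect metrics:

```python
# Hypothetical sketch: constant-velocity next-point prediction plus a
# basic spatial-error measure between predicted and actual input points.
from math import hypot

def predict_next_points(points, horizon=3):
    """Extrapolate `horizon` future (x, y) points assuming constant
    velocity, estimated from the last two observed input points."""
    if len(points) < 2:
        return []
    (x0, y0), (x1, y1) = points[-2], points[-1]
    dx, dy = x1 - x0, y1 - y0
    return [(x1 + dx * k, y1 + dy * k) for k in range(1, horizon + 1)]

def mean_spatial_error(predicted, actual):
    """Mean Euclidean distance between paired predicted and actual
    points, as could be computed from an experiment's input log."""
    pairs = list(zip(predicted, actual))
    return sum(hypot(px - ax, py - ay)
               for (px, py), (ax, ay) in pairs) / len(pairs)
```

For a finger moving right at a steady speed, `predict_next_points([(0, 0), (1, 0)], horizon=2)` yields `[(2, 0), (3, 0)]`; comparing those predictions against the points the finger actually produced gives one number per stroke that can be aggregated across a log.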
Main file: NextPointPredictionMetrics.pdf (2.86 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-01420670, version 1 (12-04-2021)

Cite

Mathieu Nancel, Daniel Vogel, Bruno de Araujo, Ricardo Jota, Géry Casiez. Next-Point Prediction Metrics for Perceived Spatial Errors. In proceedings of UIST'16, the 29th ACM Symposium on User Interface Software and Technology, Oct 2016, Tokyo, Japan. pp.271-285, ⟨10.1145/2984511.2984590⟩. ⟨hal-01420670⟩