Defining and Analyzing a Gesture Set for Interactive TV Remote on Touchscreen Phones

Abstract

In this paper, we recruited 20 participants to perform user-defined gestures on a touchscreen phone for 22 TV remote commands. In total, 440 gestures were recorded, analyzed, and paired with think-aloud data for these 22 referents. After analyzing the gestures according to an extended taxonomy of surface gestures and an agreement measure, we present a user-defined gesture set for an interactive TV remote on touchscreen phones. Beyond insights into users' mental models and an analysis of the gesture set, our findings indicate that people prefer single-handed thumb input and eyes-free gestures that require no attention switch in a TV-viewing scenario. Multi-display interaction is useful for text entry and menu access tasks. Our results contribute to better gesture design for interaction between TVs and touchscreen mobile phones.
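For context, the agreement measure commonly used in gesture-elicitation studies scores each referent by summing the squared proportions of identical gesture proposals, and the overall score is the average over all referents. The sketch below is a minimal illustration assuming that standard formulation and hypothetical gesture labels; the paper may apply an extended variant of this measure.

    from collections import Counter

    def agreement_score(proposals):
        # Agreement for one referent: sum over groups of identical
        # proposals of (group size / total proposals) squared.
        counts = Counter(proposals)
        total = len(proposals)
        return sum((c / total) ** 2 for c in counts.values())

    # Hypothetical proposals for a "volume up" referent
    # (illustrative labels, not data from the paper).
    volume_up = ["swipe_up", "swipe_up", "swipe_up",
                 "tap_top", "swipe_up", "tap_top"]
    print(agreement_score(volume_up))  # (4/6)^2 + (2/6)^2 ~= 0.56

Averaging such per-referent scores across all 22 referents would yield an overall agreement value for the elicited gesture set.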

Publication
Proceedings - 2014 IEEE International Conference on Ubiquitous Intelligence and Computing, 2014 IEEE International Conference on Autonomic and Trusted Computing, 2014 IEEE International Conference on Scalable Computing and Communications and Associated Symposia/Workshops, UIC-ATC-ScalCom 2014