TouchSkin: Proprioceptive Input with Small Devices for Mobile AR during Difficult Operating Conditions
  • Chang Hyeon Lee,
  • Youngwon Kim,
  • Gerard Jounghyun Kim
Corresponding author: [email protected]

Abstract

We propose TouchSkin, a discrete input method for smartwatches based on proprioception. The design aims to (1) enlarge the interaction space, (2) map it onto the user's body (while keeping the visual feedback on, e.g., the AR glasses or the watch face), and (3) still maintain high interaction performance, in both accuracy and completion time, by taking advantage of the proprioceptive sense. The central idea is that, with the user's attention kept on the visual layout, performance and usability can be improved by relying on the instinctive proprioceptive sense over the enlarged interaction space. We explore the design space of TouchSkin by experimentally testing its robustness under difficult operating conditions, e.g., with a small layout display and under user motion, against the conventional touch-only input method. The results show that proprioceptive selection with TouchSkin improved input accuracy while achieving completion times comparable to the conventional touch-based input method.