Latest Research

  • Facilitating Text Entry on Smartphones with QWERTY Keyboard for Users with Parkinson’s Disease

    (CHI’21) Yutao Wang, Ao Yu, Xin Yi*, Yuanwei Zhang, Ishan Chatterjee, Shwetak Patel, Yuanchun Shi
    QWERTY is the primary text-entry keyboard configuration on smartphones. However, insertion and substitution errors caused by hand tremor, which users with Parkinson’s disease often experience, can severely degrade typing efficiency and user experience. In this paper, we investigated the typing behavior of users with Parkinson’s disease on smartphones. In particular, we identified and compared the typing characteristics of users with and without Parkinson’s symptoms. We then proposed an elastic probabilistic model for input prediction. By incorporating both spatial and temporal features, this model generalizes the classical statistical decoding algorithm to correct insertion, substitution, and omission errors while maintaining a direct physical interpretation (a toy sketch of the classical decoder appears after this list).
    [Paper]

  • FaceSight: Enabling Hand-to-Face Gesture Interaction on AR Glasses with a Downward-Facing Camera Vision

    (CHI’21) Yueting Weng, Chun Yu*, Yingtian Shi, Yuhang Zhao, Yukang Yan, Yuanchun Shi
    We present FaceSight, a computer-vision-based hand-to-face gesture-sensing technique for AR glasses. FaceSight fixes an infrared camera onto the bridge of the AR glasses to add sensing capability for the lower face and for hand behaviors. We designed 21 hand-to-face gestures and demonstrated their potential interaction benefits through five AR applications. We also designed and implemented an algorithm pipeline that classifies all gestures with 83.06% accuracy, validated on data from 10 users (a toy classifier sketch follows the list below). Given its compact form factor and rich gesture set, we see FaceSight as a practical solution for augmenting the input capability of AR glasses.
    [Paper]
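
To make the decoding idea in the first paper concrete, below is a minimal sketch of the classical statistical (noisy-channel) touch decoder that the elastic probabilistic model generalizes. The key coordinates, Gaussian spread, toy lexicon, and every name here are illustrative assumptions rather than the authors' code; note that the rigid one-touch-per-letter alignment is exactly why this baseline cannot correct insertion or omission errors.

    import math

    # Approximate QWERTY key centers on a unit grid (assumed layout).
    KEY_POS = {
        'q': (0, 0), 'w': (1, 0), 'e': (2, 0), 'r': (3, 0), 't': (4, 0),
        'y': (5, 0), 'u': (6, 0), 'i': (7, 0), 'o': (8, 0), 'p': (9, 0),
        'a': (0.5, 1), 's': (1.5, 1), 'd': (2.5, 1), 'f': (3.5, 1),
        'g': (4.5, 1), 'h': (5.5, 1), 'j': (6.5, 1), 'k': (7.5, 1),
        'l': (8.5, 1), 'z': (1, 2), 'x': (2, 2), 'c': (3, 2), 'v': (4, 2),
        'b': (5, 2), 'n': (6, 2), 'm': (7, 2),
    }
    SIGMA = 0.6  # spread of the 2-D Gaussian touch model (assumed value)

    def log_touch_likelihood(touch, key):
        """log P(touch | intended key) under an isotropic Gaussian."""
        kx, ky = KEY_POS[key]
        dx, dy = touch[0] - kx, touch[1] - ky
        return -(dx * dx + dy * dy) / (2 * SIGMA * SIGMA)

    def decode(touches, lexicon):
        """Return the word maximizing log P(word) + sum of touch log-likelihoods.

        The classical decoder assumes one touch per letter, so it can fix
        substitution errors but not insertions or omissions; relaxing this
        rigid alignment is what an elastic model adds.
        """
        best_word, best_score = None, -math.inf
        for word, log_prior in lexicon.items():
            if len(word) != len(touches):  # rigid one-to-one alignment
                continue
            score = log_prior + sum(
                log_touch_likelihood(t, c) for t, c in zip(touches, word))
            if score > best_score:
                best_word, best_score = word, score
        return best_word

    # Toy usage: two noisy touches near 'h' and 'i' decode to "hi".
    lexicon = {'hi': math.log(0.5), 'ho': math.log(0.3), 'it': math.log(0.2)}
    print(decode([(5.4, 1.2), (7.1, 0.3)], lexicon))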
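
For the second paper, here is a minimal PyTorch sketch of the kind of frame-level classifier a hand-to-face gesture pipeline could end in. Only the 21-gesture count comes from the abstract; the architecture, 64x64 input resolution, and all names are assumptions for illustration, not FaceSight's published pipeline.

    import torch
    import torch.nn as nn

    class GestureNet(nn.Module):
        """Toy CNN mapping a single IR frame to one of 21 gesture classes."""

        def __init__(self, num_gestures=21):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 1-channel IR frame
                nn.ReLU(),
                nn.MaxPool2d(2),  # 64x64 -> 32x32
                nn.Conv2d(16, 32, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.MaxPool2d(2),  # 32x32 -> 16x16
            )
            self.head = nn.Sequential(
                nn.Flatten(),
                nn.Linear(32 * 16 * 16, 64),
                nn.ReLU(),
                nn.Linear(64, num_gestures),  # one logit per gesture
            )

        def forward(self, x):  # x: (batch, 1, 64, 64) normalized IR frames
            return self.head(self.features(x))

    # Toy usage: classify a batch of two random 64x64 "IR frames".
    model = GestureNet()
    logits = model(torch.randn(2, 1, 64, 64))
    print(logits.argmax(dim=1))  # predicted gesture indices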