
Data Glove-based Personalized Continuous Gesture Segmentation

  • Liufeng Fan
  • Zhan Zhang
  • Yiwei Wang
  • Decheng Zuo*
  • Yinran Wang
  • Zhongyuan Chen

*Corresponding author for this work

Faculty of Computing, Harbin Institute of Technology

Research output: Contribution to journal › Conference article › peer-review

Abstract

In recent years, gesture recognition based on data gloves has attracted increasing attention as a human-computer interaction (HCI) method that is natural, convenient, stable, robust, easy to recognize, and applicable to a variety of usage environments. This research first proposes an advanced smart data glove that integrates cutting-edge flexible capacitive sensors on the fingertips and a 6-axis IMU on the back of the hand to recognize gestures. Secondly, this study proposes a personalized continuous gesture segmentation (PCGS) model that adaptively computes the most appropriate gesture-segmentation threshold for the current user and introduces multi-sliding-window theory and kinematic knowledge to perform personalized gesture segmentation. The results show that our PCGS model achieves an average segmentation accuracy of 94.3% and outperforms state-of-the-art approaches by 11.2% to 18.5%.
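The abstract describes two ingredients: a per-user (adaptive) segmentation threshold and sliding-window segmentation over the glove's motion signal. The paper's actual model is not given here, so the following is only a minimal illustrative sketch under assumed names and parameters: the threshold is derived from a short resting-hand calibration, and a single sliding window marks where windowed motion energy crosses it.

```python
# Illustrative sketch of adaptive-threshold sliding-window segmentation.
# All function names, the mean + k*std threshold rule, and the window
# size are assumptions for illustration, not the authors' PCGS model.

def personalized_threshold(calibration_energy, k=2.0):
    """Per-user threshold: mean + k * std of the user's resting-hand
    motion energy (k is an assumed tuning constant)."""
    n = len(calibration_energy)
    mean = sum(calibration_energy) / n
    var = sum((e - mean) ** 2 for e in calibration_energy) / n
    return mean + k * var ** 0.5

def segment_gestures(energy, threshold, window=3):
    """Slide a window over the motion-energy stream; a gesture segment
    opens when the windowed average exceeds the threshold and closes
    when it drops back below."""
    segments, start = [], None
    for i in range(len(energy) - window + 1):
        avg = sum(energy[i:i + window]) / window
        if avg > threshold and start is None:
            start = i                      # gesture onset
        elif avg <= threshold and start is not None:
            segments.append((start, i + window - 1))  # gesture offset
            start = None
    if start is not None:                  # stream ended mid-gesture
        segments.append((start, len(energy) - 1))
    return segments

# Toy usage: a rest-period calibration followed by one burst of motion.
rest = [0.10, 0.12, 0.09, 0.11, 0.10]
thr = personalized_threshold(rest)
stream = [0.1, 0.1, 0.9, 1.1, 1.0, 0.95, 0.1, 0.1, 0.1]
print(segment_gestures(stream, thr))
```

A real system would, as the abstract indicates, combine several sliding windows of different lengths and kinematic constraints rather than a single energy threshold; this sketch only shows the basic mechanism of personalizing the threshold before segmenting.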

Keywords

  • Gesture recognition
  • Human-Computer Interaction
  • Human-Machine Interaction Techniques and Devices
  • Internet of Things

