
A Novel Gait Pattern Recognition Method Based on LSTM-CNN for Lower Limb Exoskeleton

  • Chaofeng Chen
  • Zhijiang Du
  • Long He
  • Yongjun Shi
  • Jiaqi Wang
  • Wei Dong*
  • *Corresponding author for this work
  • Harbin Institute of Technology
  • China South Industries Group Corp.

Research output: Contribution to journal › Article › peer-review

Abstract

This paper describes a novel gait pattern recognition method based on Long Short-Term Memory (LSTM) and Convolutional Neural Network (CNN) for lower limb exoskeletons. Inertial Measurement Units (IMUs) installed on the exoskeleton collect motion information, which serves as the input to the LSTM-CNN. This article considers five common gait patterns: walking, going up stairs, going down stairs, sitting down, and standing up. In the LSTM-CNN model, the LSTM layer processes temporal sequences and the CNN layer extracts features. To optimize the structure of the proposed deep neural network, several hyperparameter selection experiments were carried out. In addition, to verify the superiority of the proposed recognition method, it is compared with several common methods such as LSTM, CNN, and SVM. The results show that the average recognition accuracy reaches 97.78%, indicating a good recognition effect. Finally, experimental results on gait pattern switching show that the proposed method can identify a switch in gait pattern in time, demonstrating good real-time performance.
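The abstract describes an architecture in which an LSTM layer first processes the temporal IMU sequence and a CNN layer then extracts features for classification into five gait patterns. The following is a minimal NumPy sketch of that pipeline, not the authors' implementation: the window length, channel count, hidden size, filter count, and random (untrained) weights are all assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions (not from the paper): 6-axis IMU, 128-step window.
T, D = 128, 6        # time steps, IMU channels
H = 32               # LSTM hidden size
K, F = 5, 16         # conv kernel width, number of filters
C = 5                # gait classes: walk, stairs up/down, sit down, stand up

# Randomly initialised weights stand in for trained parameters.
Wx = rng.normal(0, 0.1, (4 * H, D))   # LSTM input weights (i, f, g, o gates)
Wh = rng.normal(0, 0.1, (4 * H, H))   # LSTM recurrent weights
b  = np.zeros(4 * H)
Wc = rng.normal(0, 0.1, (F, H, K))    # 1-D conv filters over the LSTM output
Wo = rng.normal(0, 0.1, (C, F))       # linear classifier weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm(x):
    """Run a single-layer LSTM over (T, D); return all hidden states (T, H)."""
    h, c = np.zeros(H), np.zeros(H)
    out = np.empty((T, H))
    for t in range(T):
        z = Wx @ x[t] + Wh @ h + b
        i, f = sigmoid(z[:H]), sigmoid(z[H:2*H])
        g, o = np.tanh(z[2*H:3*H]), sigmoid(z[3*H:])
        c = f * c + i * g
        h = o * np.tanh(c)
        out[t] = h
    return out

def conv1d(seq):
    """Valid 1-D convolution of (T, H) with F width-K filters, ReLU -> (T-K+1, F)."""
    L = T - K + 1
    out = np.empty((L, F))
    for t in range(L):
        window = seq[t:t + K].T                                   # (H, K)
        out[t] = np.maximum(0.0, np.einsum('fhk,hk->f', Wc, window))
    return out

def forward(x):
    """IMU window -> LSTM -> CNN -> global average pool -> softmax over classes."""
    feats = conv1d(lstm(x))              # (T-K+1, F)
    pooled = feats.mean(axis=0)          # (F,)
    logits = Wo @ pooled                 # (C,)
    e = np.exp(logits - logits.max())
    return e / e.sum()

probs = forward(rng.normal(size=(T, D)))
print(probs.shape)
```

With trained weights, `np.argmax(probs)` would give the recognised gait pattern for each sliding window; running this per window is also how the switching-gait experiment in the abstract would be evaluated online.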

Original language: English
Pages (from-to): 1059-1072
Number of pages: 14
Journal: Journal of Bionic Engineering
Volume: 18
Issue number: 5
State: Published - Sep 2021

Keywords

  • Gait pattern recognition
  • LSTM-CNN
  • Lower limb exoskeleton
  • Real-time performance
  • Recognition accuracy

