Human body action recognition based on a quaternion spatial-temporal convolutional neural network
Affiliation:

School of Information Engineering, Northeast Electric Power University, Jilin 132012, China

CLC Number: TP391.4; TH164

    Abstract:

    Traditional CNN models extract features from gray-scale image sequences or from the separate channels of color images, which ignores the interdependency among the channels and destroys the color characteristics of real-world objects, thereby reducing the accuracy of human body action recognition. To solve this problem, a human body action recognition method based on a quaternion spatial-temporal convolutional neural network (QST-CNN) is proposed. First, the codebook algorithm is applied to all images in the sample set to extract the key regions of human motion. Then, the quaternion matrix expression of the color images is taken as the input of the QST-CNN. The spatial convolutional layer of the CNN is extended to a quaternion spatial convolutional layer, in which the red, green, and blue channel values of a color image are treated jointly as a whole when extracting spatial action features, avoiding the loss of spatial relationships among channels. The dynamic information of adjacent frames is then extracted in a temporal convolutional layer. Finally, experiments were conducted comparing the QST-CNN with a gray single-channel CNN (Gray-CNN) and an RGB three-channel CNN (3Channel-CNN). The results demonstrate that the QST-CNN boosts action recognition performance and is superior to other popular methods, achieving recognition rates of 85.34% and 80.2% on the Weizmann and UCF Sports datasets, respectively.
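    The core idea of the quaternion representation is that each RGB pixel is encoded as a pure quaternion (0, R, G, B), so a convolution multiplies whole pixels by whole kernel entries via the Hamilton product and the three channels are processed jointly rather than separately. A minimal NumPy sketch of this idea is given below; the function names and the valid-mode loop convolution are illustrative assumptions for exposition, not the paper's implementation:

```python
import numpy as np

def hamilton_product(p, q):
    """Quaternion (Hamilton) product of arrays of shape (..., 4) = (w, x, y, z)."""
    w1, x1, y1, z1 = np.moveaxis(p, -1, 0)
    w2, x2, y2, z2 = np.moveaxis(q, -1, 0)
    return np.stack([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ], axis=-1)

def rgb_to_quaternion(img):
    """Encode an (H, W, 3) RGB image as (H, W, 4) pure quaternions (0, R, G, B)."""
    h, w, _ = img.shape
    return np.concatenate([np.zeros((h, w, 1)), img], axis=-1)

def quaternion_conv2d(qimg, qkernel):
    """Valid-mode 2D convolution where each multiply is a Hamilton product,
    so R, G, and B interact instead of being filtered channel-by-channel."""
    H, W, _ = qimg.shape
    kh, kw, _ = qkernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1, 4))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = qimg[i:i + kh, j:j + kw]
            out[i, j] = hamilton_product(qkernel, patch).sum(axis=(0, 1))
    return out
```

    Because the Hamilton product is non-commutative and mixes the imaginary components, a single quaternion kernel weight couples all three color channels of a pixel, which is the interdependency that per-channel real-valued convolution discards.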

History
  • Online: December 23, 2017