http://dx.doi.org/10.21289/KSIC.2021.24.6.693

Human Activity Recognition with LSTM Using the Egocentric Coordinate System Key Points  

Wesonga, Sheilla (Dept. of Electronic Eng., Kyungsung University)
Park, Jang-Sik (Dept. of Electronic Eng., Kyungsung University)
Publication Information
Journal of the Korean Society of Industry Convergence / v.24, no.6_1, 2021, pp. 693-698
Abstract
As technology advances, there is an increasing need for research in the different fields where this technology is applied. One of the most researched topics in computer vision is human activity recognition (HAR), which has been widely implemented in fields including healthcare, video surveillance, and education. In this paper we therefore present a scale- and rotation-invariant human activity recognition system that employs the Kinect depth sensor to obtain the human skeleton joints. In contrast to previous approaches that use joint angles, we propose using the angle each limb makes with the X, Y, and Z axes as the feature vectors. The use of these limb angles makes our system scale invariant. We further calculate the body's relative direction in egocentric coordinates in order to provide rotation invariance. For the system parameters, we employ 8 limbs, each with its corresponding angles to the X, Y, and Z axes of the coordinate system, as feature vectors. The extracted features are finally trained and tested with a Long Short-Term Memory (LSTM) network, which gives us an average accuracy of 98.3%.
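The abstract describes the feature pipeline only at a high level, so the following is a minimal, hypothetical sketch of how such limb-axis angle features and an egocentric body frame could be computed and fed to an LSTM. The joint and limb names, the hip/spine construction of the body frame, the layer size, and the class count are assumptions for illustration and are not taken from the paper.

```python
# Hypothetical sketch only; not the authors' implementation.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Assumed set of 8 limbs as (proximal joint, distal joint) pairs.
LIMBS = [
    ("shoulder_l", "elbow_l"), ("elbow_l", "wrist_l"),
    ("shoulder_r", "elbow_r"), ("elbow_r", "wrist_r"),
    ("hip_l", "knee_l"), ("knee_l", "ankle_l"),
    ("hip_r", "knee_r"), ("knee_r", "ankle_r"),
]

def egocentric_frame(joints):
    """Body-centred rotation matrix built from the skeleton (one plausible choice).

    X runs from the left hip to the right hip, Y up the spine, and Z is their
    cross product; the rows of the returned matrix are the egocentric axes.
    """
    x = joints["hip_r"] - joints["hip_l"]
    x = x / np.linalg.norm(x)
    y = joints["spine"] - 0.5 * (joints["hip_l"] + joints["hip_r"])
    y = y - np.dot(y, x) * x                    # orthogonalise against x
    y = y / np.linalg.norm(y)
    z = np.cross(x, y)
    return np.stack([x, y, z])

def frame_features(joints):
    """24-D feature vector: angle of each limb with the egocentric X, Y, Z axes."""
    R = egocentric_frame(joints)
    feats = []
    for proximal, distal in LIMBS:
        v = R @ (joints[distal] - joints[proximal])
        v = v / np.linalg.norm(v)               # unit vector -> scale invariance
        feats.extend(np.arccos(np.clip(v, -1.0, 1.0)))  # angles with X, Y, Z
    return np.asarray(feats)

# Minimal LSTM classifier over sequences of per-frame feature vectors.
# Layer size and class count are illustrative, not taken from the paper.
num_classes = 6
model = keras.Sequential([
    keras.Input(shape=(None, 24)),              # (time steps, 8 limbs x 3 angles)
    layers.LSTM(64),
    layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

Normalising each limb vector removes its length, which is what makes the angle features scale invariant, while expressing the limbs in the body-centred frame removes the subject's global orientation, giving rotation invariance; sequences of the resulting 24-D frame vectors (8 limbs × 3 axis angles) form the LSTM input.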
Keywords
Human Activity Recognition (HAR); Kinect Depth Sensor; Long Short-Term Memory (LSTM)