Bio-signal Data Augmentation Technique for CNN-based Human Activity Recognition

  • Gerelbat BatGerel (Department of Medical IT Engineering, Soonchunhyang University)
  • Chun-Ki Kwon (Department of Medical IT Engineering, Soonchunhyang University)
  • Received : 2023.06.19
  • Accepted : 2023.06.30
  • Published : 2023.06.30

Abstract

Securing a large amount of training data is important in deep learning neural networks, including convolutional neural networks, both to avoid overfitting and to achieve good performance. In practice, however, the availability of labeled training data is very limited. To overcome this, several augmentation methods have been proposed in the literature that generate a large amount of additional training data by transforming or manipulating already-acquired training data. However, unlike for training data such as images and text, augmentation methods that generate additional bio-signal training data for convolutional neural network based human activity recognition are hard to find in the literature. Thus, this study proposes a simple but effective method for augmenting bio-signal training data for convolutional neural network based human activity recognition. The usefulness of the proposed augmentation method is validated by showing that a convolutional neural network trained with the augmented bio-signal training data recognizes human activity with high accuracy.
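The abstract states that augmentation generates additional training data by transforming or manipulating already-acquired data, but it does not specify the transformations. Purely as illustration, the minimal Python sketch below applies three signal-level augmentations commonly used for bio-signals such as sEMG (jittering, magnitude scaling, and circular time-shifting); the function augment_biosignal and all parameter values are hypothetical and are not the authors' proposed method.

  import numpy as np

  def augment_biosignal(x, rng=None):
      """Return jittered, scaled, and time-shifted copies of a 1-D bio-signal window."""
      rng = np.random.default_rng() if rng is None else rng
      augmented = []
      # Jittering: add small Gaussian noise, sigma relative to the window's std.
      augmented.append(x + rng.normal(0.0, 0.05 * x.std(), size=x.shape))
      # Magnitude scaling: multiply the whole window by a random factor near 1.
      augmented.append(x * rng.uniform(0.8, 1.2))
      # Circular time shift: rotate the window by a random sample offset.
      augmented.append(np.roll(x, rng.integers(1, len(x))))
      return augmented

  # Usage: each recorded window yields three extra training windows.
  window = np.sin(np.linspace(0, 4 * np.pi, 512))  # stand-in for a real sEMG window
  extra = augment_biosignal(window)
  print(len(extra), extra[0].shape)  # -> 3 (512,)

Each transform preserves the activity label of the original window, which is why such label-preserving manipulations are a common starting point for bio-signal augmentation.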

Acknowledgement

This research was partially supported by the Regional University Outstanding Scientists Support Program of the National Research Foundation of Korea (NRF), funded by the Korean government (Ministry of Education) in 2021 (No. 2021R111A3043994).
