
Design and Implementation of Hand Gesture Recognizer Based on Artificial Neural Network


  • Kim, Minwoo (School of Electronics and Information Engineering, Korea Aerospace University) ;
  • Jeong, Woojae (School of Electronics and Information Engineering, Korea Aerospace University) ;
  • Cho, Jaechan (School of Electronics and Information Engineering, Korea Aerospace University) ;
  • Jung, Yunho (School of Electronics and Information Engineering, Korea Aerospace University)
  • Received : 2018.11.20
  • Accepted : 2018.12.12
  • Published : 2018.12.31

Abstract

In this paper, we propose a hand gesture recognizer based on a restricted Coulomb energy (RCE) neural network and present hardware implementation results for real-time learning and recognition. Because the RCE-NN has a flexible network architecture and a low-complexity learning process, it supports real-time learning and is well suited to hand gesture recognition applications. A 3D number dataset was created using an FPGA-based test platform, and the designed hand gesture recognizer achieved 98.8% recognition accuracy on this dataset. The proposed recognizer was implemented on an Intel-Altera Cyclone IV FPGA using 26,702 logic elements and 258 Kbit of memory, and real-time learning and recognition were verified at an operating frequency of 70 MHz.

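The low-complexity learning process the abstract refers to can be illustrated with the classic RCE rule: when a training sample is not covered by any prototype neuron of its class, a new neuron is committed; neurons of a conflicting class that fire on the sample have their radii shrunk. The following is a minimal software sketch of a generic RCE classifier, not the paper's hardware design; the class and parameter names (`RCENetwork`, `max_radius`, `min_radius`) are illustrative assumptions.

```python
import numpy as np

class RCENetwork:
    """Minimal sketch of a restricted Coulomb energy (RCE) classifier.

    Each hidden neuron stores a prototype vector, a class label, and an
    activation radius. Defaults are illustrative, not the paper's values.
    """

    def __init__(self, max_radius=1.0, min_radius=1e-3):
        self.centers = []           # prototype vectors
        self.labels = []            # class label per prototype
        self.radii = []             # activation radius per prototype
        self.max_radius = max_radius
        self.min_radius = min_radius

    def _distances(self, x):
        return [np.linalg.norm(x - c) for c in self.centers]

    def train_one(self, x, label):
        x = np.asarray(x, dtype=float)
        dists = self._distances(x)
        covered = False
        for i, dist in enumerate(dists):
            if dist < self.radii[i]:
                if self.labels[i] == label:
                    covered = True      # a correct-class neuron already fires
                else:
                    # shrink the conflicting neuron so it no longer covers x
                    self.radii[i] = max(dist, self.min_radius)
        if not covered:
            # commit a new neuron; cap its radius at the distance to the
            # nearest prototype of a different class
            radius = self.max_radius
            for i, dist in enumerate(dists):
                if self.labels[i] != label:
                    radius = min(radius, dist)
            self.centers.append(x)
            self.labels.append(label)
            self.radii.append(max(radius, self.min_radius))

    def predict(self, x):
        x = np.asarray(x, dtype=float)
        fired = {self.labels[i] for i, dist in enumerate(self._distances(x))
                 if dist < self.radii[i]}
        if len(fired) == 1:
            return fired.pop()
        return None                    # no neuron fired, or classes conflict
```

Because training touches each sample once and only compares distances against stored radii, the per-sample cost is linear in the number of committed neurons, which is what makes an RCE network amenable to real-time, on-chip learning.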

Keywords

Fig. 1. Structure of RCE neural network.

Fig. 2. Block diagram of hand gesture recognizer.

Fig. 3. Interface between network control unit and neural network.

Fig. 4. Block diagram of the neuron block.

Fig. 5. Block diagram of the test platform for hand gesture recognition.

Fig. 6. Experiment environment of the test platform for hand gesture recognition.

Fig. 7. Confusion matrix.

Fig. 8. Recognition accuracy versus the number of training samples.

Table 1. FPGA implementation results of the proposed hand gesture recognizer
