Human Activity Recognition Using Sensor Fusion and Kernel Discriminant Analysis on Smartphones

  • Cho, Jung-Gil (Department of Computer Engineering, Sungkyul University)
  • Received : 2020.03.13
  • Accepted : 2020.05.20
  • Published : 2020.05.28

Abstract

Human activity recognition (HAR) using smartphones is an active research topic in computational intelligence. Smartphones are equipped with a variety of sensors, and fusing the data from these sensors enables applications to recognize a large number of activities. However, these devices have limited computational resources and a limited set of available sensors, so appropriate feature selection and classification methods are required to achieve optimal performance with efficient feature extraction. This paper proposes a smartphone-based HAR scheme that meets these requirements. The proposed method extracts time-domain features from the accelerometer, gyroscope, and barometer, and recognizes activities with high accuracy by applying kernel discriminant analysis (KDA) and a support vector machine (SVM). The approach selects, for each activity, the most relevant features of each sensor. Our comparison results show that the proposed system outperforms previous smartphone-based HAR systems.
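The pipeline described in the abstract (windowed time-domain features from fused sensor signals, a kernel discriminant projection, then an SVM) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sensor windows are synthetic, the feature set (mean, standard deviation, peak magnitude, median per channel) is a common time-domain choice rather than the paper's exact list, and since scikit-learn has no built-in KDA, the kernel discriminant step is approximated here by an RBF `KernelPCA` followed by linear discriminant analysis.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

def time_domain_features(window):
    """Per-channel time-domain features for one sensor window.

    window: (n_samples, n_channels) array of fused readings,
    e.g. 3 accelerometer + 3 gyroscope + 1 barometer channels.
    """
    return np.concatenate([
        window.mean(axis=0),            # mean level per channel
        window.std(axis=0),             # signal energy / variability
        np.abs(window).max(axis=0),     # peak magnitude
        np.median(window, axis=0),      # robust central tendency
    ])

# Synthetic stand-in for labeled sensor windows: 3 activities,
# 40 windows each, 128 samples x 7 channels per window.
rng = np.random.default_rng(0)
X, y = [], []
for label in range(3):
    for _ in range(40):
        w = rng.normal(loc=label, scale=1.0, size=(128, 7))
        X.append(time_domain_features(w))
        y.append(label)
X, y = np.asarray(X), np.asarray(y)

# KDA approximated as kernel projection + LDA, then an SVM classifier.
clf = make_pipeline(
    KernelPCA(n_components=10, kernel="rbf"),
    LinearDiscriminantAnalysis(n_components=2),
    SVC(kernel="linear"),
)
clf.fit(X, y)
acc = clf.score(X, y)
```

On this toy data the three activities are well separated, so the pipeline fits cleanly; with real sensor windows, the window length, kernel parameters, and feature set would all need tuning per sensor and activity.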

