Real-time Activity and Posture Recognition with Combined Acceleration Sensor Data from Smartphone and Wearable Device

  • Hosung Lee (Department of Computer Engineering, Kyung Hee University)
  • Sungyoung Lee (Department of Computer Engineering, Kyung Hee University)
  • Received : 2013.12.17
  • Accepted : 2014.05.23
  • Published : 2014.08.15

Abstract

Smartphones and wearable devices embedded with various high-performance sensors are spreading rapidly, and next-generation mobile computing technology built on these platforms is attracting considerable attention. Existing research treats activity recognition and posture recognition as separate problems, reflecting the different characteristics of human movement: activity recognition captures the distinct patterns produced by continuous movement, whereas posture recognition captures sudden changes in posture and body orientation. Systems that combine the two kinds of patterns at a level of performance usable in the real world remain scarce. In this paper, we propose a method that uses a smartphone and a wearable device together to recognize activities and postures at the same time. To handle smartphone and wearable sensor data jointly, we design a pre-processing step and build a recognition model that combines signal vector magnitude features with vertical and horizontal orientation pattern features. The model covers activities such as cycling, fast walking, slow walking, and running, as well as postures such as standing, sitting, and lying down. Experiments confirm the performance and validity of the proposed method, and its application on a smartphone platform in a real environment demonstrates its practical feasibility.
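
The abstract names two families of accelerometer features: a signal vector magnitude that captures the vibration pattern of continuous movement, and vertical/horizontal orientation patterns measured relative to gravity. The paper's exact formulation is not reproduced in this record, so the following is only a minimal Python sketch under the common definitions of these features; the function names, the window-mean gravity estimate, and the per-sample output format are assumptions of this illustration.

    import math

    def signal_vector_magnitude(ax, ay, az):
        """Magnitude of one 3-axis acceleration sample (common SVM definition)."""
        return math.sqrt(ax * ax + ay * ay + az * az)

    def vertical_horizontal_features(window):
        """For a window of (ax, ay, az) samples, return per-sample
        (vertical, horizontal) magnitudes, using the window mean as a
        rough gravity estimate (an assumption of this sketch)."""
        n = len(window)
        gx = sum(s[0] for s in window) / n
        gy = sum(s[1] for s in window) / n
        gz = sum(s[2] for s in window) / n
        g2 = (gx * gx + gy * gy + gz * gz) or 1e-9  # guard against a zero estimate
        features = []
        for ax, ay, az in window:
            # Project the sample onto the gravity estimate -> vertical component.
            k = (ax * gx + ay * gy + az * gz) / g2
            vx, vy, vz = k * gx, k * gy, k * gz
            # The remainder is the horizontal (motion) component.
            hx, hy, hz = ax - vx, ay - vy, az - vz
            features.append((signal_vector_magnitude(vx, vy, vz),
                             signal_vector_magnitude(hx, hy, hz)))
        return features

    if __name__ == "__main__":
        # A short window of made-up samples in units of g (device held roughly upright).
        window = [(0.02, 0.98, 0.05), (0.10, 1.05, -0.03), (-0.04, 0.95, 0.02)]
        print([round(signal_vector_magnitude(*s), 3) for s in window])
        print(vertical_horizontal_features(window))

In practice such per-sample values would be summarized per window (for example, mean and variance) before being fed to a recognition model; the abstract does not name the classifier or the window length, so those steps are left out of the sketch.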

Acknowledgement

Supported by: NIPA (National IT Industry Promotion Agency)
