Deep learning based face mask recognition for access control

  • Lee, Seung Ho (Department of Future Technologies, Korea University of Technology and Education)
  • Received: 2020.05.22
  • Reviewed: 2020.08.07
  • Published: 2020.08.31

Abstract

Coronavirus disease 2019 (COVID-19) was first identified in December 2019 in China and has since spread globally, resulting in an ongoing pandemic. Because COVID-19 spreads mainly from person to person, wearing a face mask in public is required. Nevertheless, many people still enter buildings without one despite official advice. This paper proposes a method for automatically determining whether a person is wearing a face mask. The proposed method detects the two eye regions and, based on the eye locations, predicts and extracts the mask region (i.e., the facial region below the eyes). To extract the mask region more accurately, the facial region is first aligned by rotating it so that the line connecting the two eye centers becomes horizontal. The mask region extracted from the aligned face is then fed into a convolutional neural network (CNN), which produces the final classification (with or without a mask). In experiments on 186 test images, the proposed method achieved an accuracy of 98.4%.
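The geometric core of the pipeline above — computing the tilt of the eye line, rotating the face so that line becomes horizontal, and cropping the region below the eyes — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names and the crop proportions (0.3, 0.2, 1.6) are assumptions chosen for clarity, and the eye detection step itself (done in practice with, e.g., an OpenCV Haar cascade) is omitted.

```python
import numpy as np

def eye_alignment_angle(left_eye, right_eye):
    """Angle (degrees) of the line through the two eye centers.

    Rotating the face by the negative of this angle makes the eye
    line horizontal."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    return np.degrees(np.arctan2(dy, dx))

def rotate_points(points, center, angle_deg):
    """Rotate 2-D points about `center` by -angle_deg, i.e. undo the
    measured tilt so the eyes end up on a horizontal line."""
    theta = np.radians(-angle_deg)
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s],
                    [s,  c]])
    return (np.asarray(points, dtype=float) - center) @ rot.T + center

def mask_region_box(left_eye, right_eye, scale=1.6):
    """Hypothetical mask-region crop on an aligned face: a box just
    below the eyes, sized relative to the inter-eye distance.
    The margins (0.3, 0.2) and `scale` are illustrative, not the
    paper's values."""
    d = right_eye[0] - left_eye[0]      # inter-eye distance (aligned face)
    x0 = left_eye[0] - 0.3 * d
    x1 = right_eye[0] + 0.3 * d
    y0 = left_eye[1] + 0.2 * d          # start just below the eyes
    y1 = y0 + scale * d
    return (x0, y0, x1, y1)
```

In a full system, the crop returned by `mask_region_box` would be resized to the CNN's input resolution and classified as "with mask" or "without mask"; rotating the whole image (e.g., with `cv2.warpAffine`) uses the same angle computed here.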
