Development of a Web Platform System for Worker Protection using EEG Emotion Classification

뇌파 기반 감정 분류를 활용한 작업자 보호를 위한 웹 플랫폼 시스템 개발

  • Ssang-Hee Seo (School of Computer Science and Engineering, Kyungnam University)
  • 서쌍희 (경남대학교 컴퓨터공학부)
  • Received : 2023.09.16
  • Accepted : 2023.10.31
  • Published : 2023.12.31

Abstract

As a primary technology of Industry 4.0, human-robot collaboration (HRC) requires additional measures to ensure worker safety. Previous studies on avoiding collisions between collaborative robots and workers mainly detect collisions using sensors and cameras attached to the robot. This approach requires complex algorithms to continuously track robots, people, and objects, and cannot respond quickly to changes in the work environment. The present study implements a web-based platform that manages collaborative robots by recognizing workers' emotions, specifically their perception of danger, during the collaborative process. To this end, we developed a web-based application that collects and stores emotion-related brain waves via a wearable EEG device; a deep-learning model that extracts and classifies the features of neutral, positive, and negative emotions; and an Internet-of-Things (IoT) interface program that controls motor operation according to the classified emotion. We analyzed the performance of the proposed deep-learning model on a public open dataset and on a dataset collected through actual measurement, achieving validation accuracies of 96.8% and 70.7%, respectively.
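The abstract's final component, an IoT interface that drives the motor according to the classified emotion, can be sketched as a simple decision rule. The following is a minimal illustration, not the paper's actual implementation: all names (`Emotion`, `smooth`, `motor_command`) and the majority-vote smoothing step are hypothetical, and the stop-on-negative policy is inferred from the stated goal of protecting workers who perceive danger.

```python
from collections import Counter
from enum import Enum


class Emotion(Enum):
    """The three emotion classes named in the abstract."""
    NEUTRAL = "neutral"
    POSITIVE = "positive"
    NEGATIVE = "negative"


def smooth(predictions):
    """Majority vote over a short window of per-epoch classifier outputs,
    so a single misclassified EEG epoch does not trigger a spurious stop."""
    return Counter(predictions).most_common(1)[0][0]


def motor_command(emotion, nominal_speed=1.0):
    """Map the smoothed emotion to a motor set-point: stop on NEGATIVE
    (worker perceives danger), otherwise run at the nominal speed."""
    if emotion is Emotion.NEGATIVE:
        return 0.0
    return nominal_speed


# Example: two of the last three epochs were classified as negative,
# so the smoothed decision is NEGATIVE and the motor is stopped.
window = [Emotion.NEUTRAL, Emotion.NEGATIVE, Emotion.NEGATIVE]
speed = motor_command(smooth(window))
```

Smoothing before acting is a common safety-control pattern; the window length and the policy for resuming motion after a stop would be tuned to the actual deployment.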

인더스트리 4.0의 주요 기술인 인간-로봇 협업은 작업자의 안전을 보장하기 위한 추가적인 조치들이 필요하다. 협동로봇과 작업자 간 충돌을 회피하는 기존 방식은 주로 로봇에 부착된 센서와 카메라를 기반으로 충돌을 탐지한다. 이러한 방식은 로봇, 사람, 물체를 지속적으로 추적하고 충돌회피를 위한 복잡한 알고리즘이 필요하며, 작업 환경 변화에 빠르게 대응하지 못하는 단점이 있다. 본 논문은 인간과 로봇이 협업하는 과정에서 작업자가 위험을 느낄 때의 감정을 인식하여 협동로봇과의 충돌을 방지할 수 있는 웹 기반 플랫폼을 개발하였다. 이를 위해 웨어러블 뇌파장치를 이용하여 감정 관련 뇌파를 수집하고 저장하는 웹 기반 애플리케이션을 개발하였으며, 중립/긍정/부정 감정의 특징을 추출하고 분류하는 딥러닝 모델을 제안하였다. 또한 분류된 감정에 따라 모터동작을 제어하는 사물인터넷 인터페이스 프로그램을 개발하였다. 구현된 시스템의 성능분석을 위해 공개 데이터세트와 실제 수집된 데이터세트를 사용하여 제안한 딥러닝 모델의 성능을 분석하였다. 공개 데이터세트의 경우 정확도는 96.8%이며, 실제 수집 데이터세트의 경우 정확도는 70.7%이다.
