
Haptic AR Sports Technologies for Indoor Virtual Matches

  • Published: 2021.08.01

Abstract

Outdoor sports activities have been restricted by serious air pollution, such as fine dust and yellow dust, and by abnormal weather events, such as heatwaves and heavy snow. These environmental problems have rapidly increased the demand for indoor sports activities. Virtual sports, such as virtual golf, virtual baseball, and virtual soccer, allow various sports games to be played without going outdoors, and the indoor sports industry and market have grown rapidly since the advent of virtual sports. Most virtual sports platforms use screen-based virtual reality techniques, which is why they are called screen sports. However, because these techniques rely on real balls and real players, existing platforms cannot support various sports games, especially virtual match games such as squash and boxing. This article presents screen-based haptic augmented reality (AR) technologies for a new virtual sports platform that does not use real balls and players, thereby overcoming the limitations of previous platforms. Various technologies, including human motion tracking, human action recognition, haptic feedback, screen-based AR systems, and AR sports content, are unified in the new platform. With these haptic AR technologies, the proposed platform supports sports games, including indoor virtual matches, that existing virtual sports platforms cannot.
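
The abstract names the component technologies but not how they fit together at runtime. The following minimal Python sketch, written purely for illustration, shows one way such a pipeline could be unified in a per-frame loop: motion tracking feeds action recognition, a recognized action drives a virtual-ball simulation, and the resulting impact triggers haptic feedback and screen-based AR rendering. All class names, function names, and values are assumptions for this sketch and do not come from the article.

```python
"""Illustrative per-frame pipeline sketch for the platform described in the
abstract. Every name and value here is a hypothetical placeholder, not the
authors' implementation."""

from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]


@dataclass
class Pose:
    """Skeleton estimated by a markerless motion-tracking front end."""
    joints: List[Vec3]


def track_player_pose(frame_id: int) -> Pose:
    # Placeholder for human motion tracking (e.g., skeleton estimation from RGB-D frames).
    return Pose(joints=[(0.0, 1.0, 2.0)] * 25)


def recognize_action(pose_history: List[Pose]) -> str:
    # Placeholder for human action recognition over a short pose history
    # (a real system might use a skeleton-based deep network).
    return "swing" if len(pose_history) > 10 else "idle"


def update_virtual_ball(action: str, velocity: Vec3) -> Tuple[Vec3, bool]:
    # The platform simulates the ball, so a recognized "swing" produces a
    # purely virtual impact instead of a physical one.
    if action == "swing":
        return (-velocity[0], velocity[1], velocity[2]), True
    return velocity, False


def trigger_haptic_feedback(impact: bool) -> None:
    # Placeholder for haptic feedback: send an impact cue to a haptic device.
    if impact:
        print("haptics: impact pulse")


def render_ar_scene(pose: Pose, velocity: Vec3) -> None:
    # Placeholder for screen-based AR rendering of the virtual ball and scene.
    print(f"render: {len(pose.joints)} joints, ball velocity {velocity}")


def main_loop(num_frames: int = 30) -> None:
    pose_history: List[Pose] = []
    velocity: Vec3 = (1.0, 0.0, 0.0)
    for frame_id in range(num_frames):
        pose = track_player_pose(frame_id)        # human motion tracking
        pose_history.append(pose)
        action = recognize_action(pose_history)   # human action recognition
        velocity, impact = update_virtual_ball(action, velocity)
        trigger_haptic_feedback(impact)           # haptic feedback
        render_ar_scene(pose, velocity)           # screen-based AR rendering


if __name__ == "__main__":
    main_loop()
```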

Keywords

Acknowledgment

This study was conducted as part of the research and development support program of the Ministry of Culture, Sports and Tourism and the Korea Creative Content Agency [Project No. R2020040036, Development of an indirect-sensing-based real-time interactive AR indoor sports platform].
