http://dx.doi.org/10.9708/jksci.2021.26.02.125

A Study on the Gesture Based Virtual Object Manipulation Method in Multi-Mixed Reality  

Park, Sung-Jun (Company of DataReality)
Abstract
In this paper, we propose a method for constructing a collaborative mixed reality environment and for manipulating virtual objects with a wearable IoT device. Mixed reality (MR) combines virtual reality and augmented reality, so users can view real and virtual objects at the same time, and unlike VR, an MR HMD does not induce motion sickness. Because it operates wirelessly, MR is attracting attention as a technology for industrial fields. The Myo wearable device enables arm rotation tracking and hand gesture recognition using a three-axis sensor, an EMG sensor, and an acceleration sensor. Although various MR-related studies are in progress, discussion of environments in which multiple people participate in mixed reality and manipulate virtual objects with their own hands is still insufficient. In this paper, we propose a method for constructing an environment in which collaboration is possible, together with an interaction method for smooth manipulation, so that mixed reality can be applied in real industrial fields. As a result, two people were able to participate in the mixed reality environment at the same time and share a unified state for a common virtual object, and each participant could interact with it through the Myo wearable interface device.
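The abstract's description of Myo-driven manipulation of a single shared object can be illustrated with a minimal sketch. The pose names ("fist", "fingers_spread"), the UDP transport, the client addresses, and the message format below are assumptions made for illustration only; they are not the paper's actual implementation, which is not detailed here.

import json
import socket
from dataclasses import dataclass, field

# Assumed network endpoints of the two participating MR headsets.
CLIENTS = [("192.168.0.11", 9000), ("192.168.0.12", 9000)]

@dataclass
class SharedObject:
    """Unified state of the virtual object shared by both participants."""
    position: list = field(default_factory=lambda: [0.0, 0.0, 1.0])  # meters
    rotation: list = field(default_factory=lambda: [0.0, 0.0, 0.0])  # Euler angles, degrees
    grabbed_by: str | None = None                                    # user currently holding it

def apply_gesture(obj: SharedObject, user: str, pose: str, yaw_delta: float) -> None:
    """Map a recognized Myo pose to a manipulation of the shared object.

    Assumed mapping: 'fist' grabs, 'fingers_spread' releases, and while the
    object is held the arm's yaw change (from the Myo IMU) spins it in place.
    """
    if pose == "fist" and obj.grabbed_by is None:
        obj.grabbed_by = user
    elif pose == "fingers_spread" and obj.grabbed_by == user:
        obj.grabbed_by = None
    elif obj.grabbed_by == user:
        obj.rotation[1] = (obj.rotation[1] + yaw_delta) % 360.0

def broadcast(obj: SharedObject, sock: socket.socket) -> None:
    """Push the updated state to every client so both users see the same object."""
    msg = json.dumps({"position": obj.position,
                      "rotation": obj.rotation,
                      "grabbed_by": obj.grabbed_by}).encode("utf-8")
    for addr in CLIENTS:
        sock.sendto(msg, addr)

if __name__ == "__main__":
    shared = SharedObject()
    udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Simulated event stream; in practice these events would come from the Myo SDK.
    for user, pose, yaw in [("userA", "fist", 0.0),
                            ("userA", "rest", 15.0),
                            ("userA", "fingers_spread", 0.0)]:
        apply_gesture(shared, user, pose, yaw)
        broadcast(shared, udp)

A real deployment would replace the simulated event loop with the Myo SDK's pose callbacks and the UDP broadcast with the networking layer of the MR runtime; the sketch only shows the shared-state idea described above.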
Keywords
Mixed Reality; Multi-Person; Myo Wearable Device; Cooperation; Interaction Gesture