http://dx.doi.org/10.15701/kcgs.2018.24.4.21

Interaction Technique in Smoke Simulations using Mouth-Wind on Mobile Devices  

Kim, Jong-Hyun (Kangnam University)
Abstract
In this paper, we propose a real-time interaction method that uses the user's mouth wind on a mobile device. In mobile and virtual reality environments, user interaction technology is important, but the variety of available user interface methods is still limited: most existing interaction techniques rely on screen touch or motion recognition. In this study, we propose an interface technology that enables real-time interaction using the user's mouth wind. The direction of the wind is determined from the angle and position between the user and the mobile device, and its strength is calculated from the magnitude of the user's mouth wind. To demonstrate the effectiveness of the proposed technique, we integrate the mouth-wind interface into the Navier-Stokes equations and visualize the flow of the vector field in real time. Although we present the results on mobile devices, the technique can also be applied in augmented reality (AR) and virtual reality (VR) settings that require such interface technology.
Keywords
Mobile interaction; User interface; Mouth wind; Fluid simulations;
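
The abstract does not specify the exact force model, so the following is a minimal sketch of one plausible reading: the wind direction is taken from the tilt angle between the user and the device, the wind strength from the RMS amplitude of the microphone signal, and the result is injected as an external force into the velocity field of a grid-based Navier-Stokes solver. All function names, constants, and the falloff model below are illustrative assumptions, not the paper's implementation.

import numpy as np

# Illustrative parameters (not from the paper).
GRID = 64                # simulation grid resolution
DT = 0.033               # time step (~30 fps)
MIC_NOISE_FLOOR = 0.02   # RMS below this is treated as silence

def wind_direction(device_pitch_rad):
    """Unit direction of the mouth wind in grid space, assumed to
    follow the tilt angle between the user's face and the screen."""
    return np.array([np.sin(device_pitch_rad), np.cos(device_pitch_rad)])

def wind_magnitude(mic_samples):
    """Wind strength estimated from the RMS amplitude of the microphone buffer."""
    rms = np.sqrt(np.mean(np.square(mic_samples)))
    return max(0.0, float(rms) - MIC_NOISE_FLOOR)

def inject_mouth_wind(u, v, direction, magnitude, origin=(0.5, 0.0), radius=0.08):
    """Add the mouth wind as an external force to the velocity field (u, v),
    falling off with distance from the assumed blow origin on the screen edge."""
    n = u.shape[0]
    ys, xs = np.mgrid[0:n, 0:n] / float(n)
    dist2 = (xs - origin[0]) ** 2 + (ys - origin[1]) ** 2
    falloff = np.exp(-dist2 / (radius * radius))
    u += DT * magnitude * direction[0] * falloff
    v += DT * magnitude * direction[1] * falloff
    return u, v

# Example frame: a 0.3 rad tilt and a synthetic stand-in for a real audio buffer.
u = np.zeros((GRID, GRID))
v = np.zeros((GRID, GRID))
mic = 0.2 * np.random.randn(1024)
u, v = inject_mouth_wind(u, v, wind_direction(0.3), wind_magnitude(mic))

In a full solver, this force-injection step would run once per frame before the advection, diffusion, and projection passes of the Navier-Stokes update.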