http://dx.doi.org/10.5762/KAIS.2020.21.9.25

Display of Irradiation Location of Ultrasonic Beauty Device Using AR Scheme  

Kang, Moon-Ho (Department of Information and Communication Engineering, Sunmoon University)
Publication Information
Journal of the Korea Academia-Industrial cooperation Society, vol. 21, no. 9, 2020, pp. 25-31
Abstract
In this study, an Android app was developed for the safe use of a portable ultrasonic skin-beauty device: it shows the user the irradiation locations of the focused ultrasound through augmented reality (AR), enabling stable self-treatment. The utility of the app was assessed through testing. While the user treats the face with the beauty device, a smartphone camera detects the user's face and the ultrasonic irradiation location on it in real time. The irradiation location is then marked on the face image shown to the user, so that no area receives excessive ultrasound during treatment. To this end, ML Kit detects the user's face landmarks in real time, and these are compared with a reference face model to estimate the face pose, i.e., its rotation and movement. An LED was mounted on the irradiation head of the device and lit during irradiation; its light is detected on the smartphone screen to locate the ultrasonic irradiation position, which is then registered and displayed on the face image according to the estimated face pose. Each task in the app was implemented with threads and timers, and all tasks completed within 75 ms. The tests showed that registering and displaying 120 ultrasound irradiation positions took less than 25 ms, and that the display accuracy was within 20 mm as long as the face was not rotated significantly.
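The abstract names ML Kit for real-time face landmark detection. The following is a minimal sketch of that step, assuming the standard ML Kit Face Detection API; the camera plumbing and the way results are handed to the pose step are illustrative assumptions, not details from the paper.

```kotlin
// Sketch: real-time face landmark detection with ML Kit, as named in the abstract.
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.face.FaceDetection
import com.google.mlkit.vision.face.FaceDetectorOptions
import com.google.mlkit.vision.face.FaceLandmark

val options = FaceDetectorOptions.Builder()
    .setPerformanceMode(FaceDetectorOptions.PERFORMANCE_MODE_FAST) // favor speed for live video
    .setLandmarkMode(FaceDetectorOptions.LANDMARK_MODE_ALL)        // eyes, nose, mouth, cheeks, ears
    .build()

val detector = FaceDetection.getClient(options)

// Called once per camera frame; InputImage wraps the frame and its rotation.
fun detectLandmarks(image: InputImage) {
    detector.process(image)
        .addOnSuccessListener { faces ->
            val face = faces.firstOrNull() ?: return@addOnSuccessListener
            // 2-D landmark positions on the preview image
            val nose = face.getLandmark(FaceLandmark.NOSE_BASE)?.position
            val leftEye = face.getLandmark(FaceLandmark.LEFT_EYE)?.position
            val mouth = face.getLandmark(FaceLandmark.MOUTH_BOTTOM)?.position
            // ...pass these points to the face pose estimation step
        }
        .addOnFailureListener { /* drop this frame and wait for the next */ }
}
```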
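The paper states only that the detected landmarks are compared with a reference face model to estimate rotation and movement. A common way to realize such a comparison is OpenCV's solvePnP; the sketch below uses that technique with an illustrative generic 3-D model and approximate pinhole intrinsics, none of which are taken from the paper.

```kotlin
// Sketch: face pose (rotation + translation) from 2-D landmarks and a 3-D
// reference model via solvePnP. Model coordinates and intrinsics are assumptions.
import org.opencv.calib3d.Calib3d
import org.opencv.core.*

fun estimatePose(imagePoints: MatOfPoint2f, frameWidth: Double, frameHeight: Double): Pair<Mat, Mat> {
    // Generic reference face model (millimetres), same landmark order as imagePoints.
    val modelPoints = MatOfPoint3f(
        Point3(0.0, 0.0, 0.0),        // nose base
        Point3(-30.0, -30.0, -30.0),  // left eye
        Point3(30.0, -30.0, -30.0),   // right eye
        Point3(-25.0, 30.0, -30.0),   // left mouth corner
        Point3(25.0, 30.0, -30.0)     // right mouth corner
    )
    // Approximate intrinsics: focal length ~ frame width, principal point at centre.
    val cameraMatrix = Mat(3, 3, CvType.CV_64F)
    cameraMatrix.put(0, 0,
        frameWidth, 0.0, frameWidth / 2,
        0.0, frameWidth, frameHeight / 2,
        0.0, 0.0, 1.0)
    val rvec = Mat()
    val tvec = Mat()
    // Empty MatOfDouble = assume no lens distortion.
    Calib3d.solvePnP(modelPoints, imagePoints, cameraMatrix, MatOfDouble(), rvec, tvec)
    return rvec to tvec // face rotation and translation relative to the camera
}
```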
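For the LED search, the abstract says only that the LED light is found on the smartphone screen. One plausible implementation, assumed here rather than taken from the paper, is to treat the lit LED as the brightest blob in the frame; the blur kernel and brightness threshold are illustrative values.

```kotlin
// Sketch: locate the LED on the irradiation head as the brightest spot in the frame.
import org.opencv.core.Core
import org.opencv.core.Mat
import org.opencv.core.Point
import org.opencv.core.Size
import org.opencv.imgproc.Imgproc

fun findLedSpot(frameBgr: Mat): Point? {
    val gray = Mat()
    Imgproc.cvtColor(frameBgr, gray, Imgproc.COLOR_BGR2GRAY)
    // Blur so a single hot pixel cannot outscore the LED blob.
    Imgproc.GaussianBlur(gray, gray, Size(9.0, 9.0), 0.0)
    val mm = Core.minMaxLoc(gray)
    // Accept the maximum only if it is bright enough to be the lit LED.
    return if (mm.maxVal > 240.0) mm.maxLoc else null
}
```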
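The abstract reports that each task runs via a thread and a timer, with every task completing within 75 ms. A hedged sketch of one way to drive such a periodic pipeline is a scheduled executor, shown below; the 75 ms period is from the paper, while the stage contents and function names are stubs.

```kotlin
// Sketch: periodic processing pipeline on a worker thread with a fixed-rate timer.
import java.util.concurrent.Executors
import java.util.concurrent.TimeUnit

val scheduler = Executors.newSingleThreadScheduledExecutor()

fun startPipeline() {
    // Retrigger the whole cycle every 75 ms (the budget reported in the paper).
    scheduler.scheduleAtFixedRate({
        processFrame() // 1. grab frame, 2. landmarks, 3. pose, 4. LED spot, 5. overlay
    }, 0, 75, TimeUnit.MILLISECONDS)
}

fun processFrame() { /* pipeline stages go here */ }
```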
Keywords
Ultrasound skin-beauty device; Android app; Augmented reality; Face landmark detection; Face pose estimation;
Citations & Related Records
Reference
1 G. ter Haar and C. Coussios, "High intensity focused ultrasound: Physical principles and devices", International Journal of Hyperthermia, vol. 23, no. 2, pp. 89-104, 2007.
2 D. Schmalstieg and T. Hollerer, Augmented Reality: Principles and Practice, Addison-Wesley, pp. 78-84, 2016.
3 S. Aukstakalnis, Practical Augmented Reality: A Guide to the Technologies, Applications, and Human Factors for AR and VR, Addison-Wesley, pp. 227-329, 2016.
4 Karthikeyan NG, Machine Learning Projects for Mobile Applications, Packt, pp. 85-108, 2018.
5 A. Kumar et al., "Face detection techniques: a review", Artificial Intelligence Review, vol. 52, pp. 927-948, 2019. DOI: https://doi.org/10.1007/s10462-018-9650-2
6 J. Shin and D. Kim, "Hybrid Approach for Facial Feature Detection and Tracking under Occlusion", IEEE Signal Processing Letters, vol. 21, no. 12, pp. 1486-1490, Dec. 2014. DOI: https://doi.org/10.1109/LSP.2014.2338911
7 D. Han et al., "Design and Implementation of Real-time High Performance Face Detection Engine", The Institute of Electronics Engineers of Korea-SP, vol. 47, no. 2, pp. 33-44, 2010.
8 P. Viola and M. Jones, "Rapid object detection using a boosted cascade of simple features", Proceedings of the 2001 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp. 511-518, 2001.
9 D. Garg et al., "A Deep Learning Approach for Face Detection using YOLO", IEEE Punecon, pp. 1-4, 2018.
10 A. Rosebrock, "Facial landmarks with dlib, OpenCV, and Python", PyImageSearch, 2017. https://www.pyimagesearch.com/2017/04/03/facial-landmarks-dlib-opencv-python/
11 R. Laganiere, OpenCV Computer Vision Application Programming Cookbook, 2nd Edition, Packt, pp. 281-313, 2014.