http://dx.doi.org/10.14372/IEMEK.2021.16.5.163

Implementation and Verification of Deep Learning-based Automatic Object Tracking and Handy Motion Control Drone System  

Kim, Youngsoo (Jeonju University)
Lee, Junbeom (Korea Air Force Academy)
Lee, Chanyoung (Korea Air Force Academy)
Jeon, Hyeri (Korea Air Force Academy)
Kim, Seungpil (Korea Air Force Academy)
Abstract
In this paper, we implemented a deep learning-based automatic object tracking and handy motion control drone system and analyzed its performance. The system automatically detects and tracks targets by analyzing images from the drone's camera with deep learning algorithms, namely YOLO, MobileNet, and DeepSORT. These deep learning-based detection and tracking algorithms achieve both higher target detection accuracy and faster processing than the conventional color-based CAMShift algorithm. In addition, to make it easy to control the drone by hand from the ground control station, we classified handy motions and generated flight control commands through motion recognition with the YOLO algorithm. We confirmed that the proposed deep learning-based target tracking and handy motion control system tracks targets stably and allows the drone to be controlled easily.
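
To make the pipeline described above concrete, the sketch below shows one possible way to feed per-frame detections into a DeepSORT-style tracker and to map recognized handy-motion classes to flight commands. It is a minimal illustration built on the off-the-shelf ultralytics and deep-sort-realtime packages; the package choice, the motion class labels, and the command strings are assumptions for this sketch, not the authors' implementation.

```python
# Minimal sketch (assumed libraries): per-frame YOLO detection fed into a
# DeepSORT tracker, plus a lookup table that turns recognized handy-motion
# classes into flight control commands. Not the authors' code.
import cv2
from ultralytics import YOLO                                # assumed detector package
from deep_sort_realtime.deepsort_tracker import DeepSort    # assumed tracker package

# Hypothetical motion-class -> flight-command table (labels are placeholders).
MOTION_TO_COMMAND = {"palm_up": "ASCEND", "palm_down": "DESCEND", "fist": "HOVER"}

model = YOLO("yolov8n.pt")      # any pretrained detector; the paper uses YOLO/MobileNet
tracker = DeepSort(max_age=30)  # DeepSORT-style appearance + motion association

cap = cv2.VideoCapture(0)       # stand-in for the drone camera stream
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    # 1) Deep-learning detection on the current frame.
    result = model(frame, verbose=False)[0]
    detections = []
    for box, conf, cls in zip(result.boxes.xywh, result.boxes.conf, result.boxes.cls):
        cx, cy, w, h = box.tolist()
        # deep_sort_realtime expects ([left, top, w, h], confidence, class_name).
        detections.append(([cx - w / 2, cy - h / 2, w, h], float(conf), model.names[int(cls)]))

    # 2) Associate detections across frames so each target keeps a stable ID.
    for track in tracker.update_tracks(detections, frame=frame):
        if track.is_confirmed():
            print("track", track.track_id, track.to_ltrb())

    # 3) Translate any recognized handy-motion class into a flight command
    #    (in the real system this step runs at the ground control station).
    for _, _, name in detections:
        if name in MOTION_TO_COMMAND:
            print("command:", MOTION_TO_COMMAND[name])

cap.release()
```

In the actual system the command would be sent to the flight controller rather than printed, and the detector would be trained on the handy-motion classes; the sketch only illustrates the overall control flow.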
Keywords
Deep learning; Handy motion control; Object detection and tracking; Intelligent drone system