Acknowledgments
This work was supported in part by the Korea Planning & Evaluation Institute of Industrial Technology (KEIT) grant funded by the Korean government (Ministry of Trade, Industry and Energy) in 2024 (No. RS-2024-00406121, Development of an Automotive Security Vulnerability-based Threat Analysis System (R&D)), in part by the National Research Foundation of Korea (NRF) grant funded by the Korean government (Ministry of Science and ICT) (No. RS-2022-00166529), and in part by the Information Security Core Source Technology Development Program of the Institute of Information & Communications Technology Planning & Evaluation (IITP) funded by the Ministry of Science and ICT (No. RS-2024-00337414).
References
- T. Springer et al., "On-Device Deep Learning Inference for SoC Architectures," Electronics, 2021.
- T. Zhao et al., "A Survey of Deep Learning on Mobile Devices: Applications, Optimizations, Challenges," Proceedings of the IEEE, 2022.
- J. Yang et al., "On-Device Unsupervised Image Segmentation," ACM/IEEE DAC, 2023.
- M. Hanyao et al., "Edge-assisted On-device Object Detection for Real-time Video Analytics," IEEE INFOCOM, 2021.
- Z. Qin et al., "ThunderNet: Real-time Object Detection on Mobile Devices," ICCV, 2019.
- Y. Cheng et al., "A Survey of Model Compression and Acceleration for Deep Neural Networks," arXiv, 2020.
- Z. Liu et al., "Rethinking the Value of Network Pruning," arXiv, 2019.
- R. Reed, "Pruning algorithms-a survey," IEEE Transactions on Neural Networks, 1993.
- S. Yao et al., "DeepIoT: Compressing Deep Neural Network Structures for Sensing Systems with a Compressor-Critic Framework," arXiv, 2017.
- B. Rokh et al., "A Comprehensive Survey on Model Quantization for Deep Neural Networks in Image Classification," ACM Transactions on Intelligent Systems and Technology, 2023.
- H. Wu et al., "Integer Quantization for Deep Learning Inference: Principles and Empirical Evaluation," arXiv, 2020.
- N. Kim and J. An, "Knowledge Distillation for Traversable Region Detection of LiDAR Scan in Off-Road Environments," Sensors, 2024.
- C. Zhou et al., "EdgeSAM: Prompt-In-the-Loop Distillation for On-Device Deployment of SAM," arXiv, 2024.
- B. Zoph and Q. V. Le, "Neural Architecture Search with Reinforcement Learning," arXiv, 2017.
- A. Burrello et al., "Enhancing Neural Architecture Search With Multiple Hardware Constraints for Deep Learning Model Deployment on Tiny Devices," IEEE Transactions on Emerging Topics in Computing, 2024.
- J. Kim et al., "PQK: Model Compression via Pruning, Quantization, and Knowledge Distillation," arXiv, 2021.