http://dx.doi.org/10.3837/tiis.2019.06.018

Real-Time Visible-Infrared Image Fusion using Multi-Guided Filter  

Jeong, Woojin (Department of Computer Science and Engineering, Hanyang University)
Han, Bok Gyu (Department of Computer Science and Engineering, Hanyang University)
Yang, Hyeon Seok (Department of Computer Science and Engineering, Hanyang University)
Moon, Young Shik (Department of Computer Science and Engineering, Hanyang University)
Publication Information
KSII Transactions on Internet and Information Systems (TIIS) / v.13, no.6, 2019 , pp. 3092-3107 More about this Journal
Abstract
Visible-infrared image fusion synthesizes an infrared image and a visible image into a single fused image that combines the complementary advantages of both. An infrared image can capture a target object in dark or foggy environments, but objects in it appear blurry. A visible image, in contrast, shows objects clearly under normal lighting conditions but is of little use in dark or foggy environments. In this paper, we propose a multi-guided filter and a real-time image fusion method based on it. The multi-guided filter is a modification of the guided filter that accepts multiple guidance images. The proposed fusion method is much faster than conventional image fusion methods, synthesizing 57.93 frames per second at an image size of 320×270. In our experiments, we compare the proposed method with conventional methods in terms of quantitative and qualitative performance, fusion speed, and flickering artifacts. The results confirm that the proposed method achieves real-time processing and synthesizes flicker-free video.
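The proposed multi-guided filter builds on the standard guided filter of He et al. As background, that single-guidance filter fits a local linear model of the filtering input `p` against the guidance image `I` within each window. The sketch below is a minimal illustration of the standard filter, not the paper's implementation; the radius `r` and regularization parameter `eps` are illustrative defaults.

```python
import numpy as np

def box_filter(img, r):
    """Mean filter over a (2r+1) x (2r+1) window via an integral image."""
    pad = np.pad(img, r, mode='edge')                      # full window support at borders
    c = np.cumsum(np.cumsum(pad, axis=0), axis=1)          # 2-D cumulative sum
    c = np.pad(c, ((1, 0), (1, 0)))                        # zero row/column for clean differencing
    h, w = img.shape
    n = 2 * r + 1
    return (c[n:n+h, n:n+w] - c[:h, n:n+w]
            - c[n:n+h, :w] + c[:h, :w]) / (n * n)

def guided_filter(I, p, r=4, eps=0.01):
    """Standard guided filter: edge-preserving smoothing of p, guided by I."""
    mean_I = box_filter(I, r)
    mean_p = box_filter(p, r)
    cov_Ip = box_filter(I * p, r) - mean_I * mean_p
    var_I = box_filter(I * I, r) - mean_I ** 2
    a = cov_Ip / (var_I + eps)                             # local linear coefficient
    b = mean_p - a * mean_I                                # local offset
    return box_filter(a, r) * I + box_filter(b, r)         # q = mean(a) * I + mean(b)
```

Because every step reduces to box filtering, the cost per pixel is independent of the window radius, which is what makes guided-filter-based fusion attractive for real-time use.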
Keywords
Visible-infrared image fusion; guided image filtering; real-time processing
Citations & Related Records
Times Cited By KSCI: 2