http://dx.doi.org/10.3745/KTSDE.2020.9.12.403

Clustering Performance Analysis of Autoencoder with Skip Connection  

Jo, In-su (Dankook University, Department of Computer Engineering)
Kang, Yunhee (Baekseok University, Division of ICT)
Choi, Dong-bin (Dankook University, Department of Computer Engineering)
Park, Young B. (Dankook University, Department of Software)
Publication Information
KIPS Transactions on Software and Data Engineering, Vol.9, No.12, pp.403-410, 2020
Abstract
In addition to research on noise removal and super-resolution using the data-restoration (output) function of the autoencoder, research on improving clustering performance using the autoencoder's dimension-reduction function is being actively conducted. The clustering function and the data-restoration function of the autoencoder have in common that both improve through the same training. Based on this characteristic, this study conducted an experiment to determine whether an autoencoder model designed for excellent data-restoration performance is also superior in clustering performance. The skip-connection technique was used to design an autoencoder with excellent data-restoration performance. The restoration performance and clustering performance of an autoencoder model with skip connections and a model without them were presented as graphs and visualizations. Restoration performance increased, but clustering performance decreased. This result indicates that, for neural network models such as autoencoders, a good output does not guarantee that each layer has learned the characteristics of the data well. Finally, the degradation in clustering performance was compensated by using both the latent code and the skip connection. This study is a preliminary study toward solving the Hanja Unicode problem through clustering.
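The abstract describes an autoencoder whose encoder activations are forwarded to the decoder through skip connections, with the latent code used for clustering, and a compensation scheme that combines the latent code with the skip-connected features. A minimal NumPy sketch of such a forward pass follows; the layer sizes, the additive skip, and the feature concatenation are illustrative assumptions, not the paper's exact architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Hypothetical dense autoencoder: 784 -> 128 -> 32 (latent) -> 128 -> 784
W_enc1 = rng.normal(scale=0.1, size=(784, 128))
W_enc2 = rng.normal(scale=0.1, size=(128, 32))
W_dec1 = rng.normal(scale=0.1, size=(32, 128))
W_dec2 = rng.normal(scale=0.1, size=(128, 784))

def forward(x):
    h1 = relu(x @ W_enc1)        # encoder activation
    z = relu(h1 @ W_enc2)        # latent code, used for clustering
    d1 = relu(z @ W_dec1) + h1   # skip connection: add encoder activation to decoder
    x_hat = d1 @ W_dec2          # reconstruction (restoration output)
    return h1, z, x_hat

x = rng.normal(size=(4, 784))    # a batch of 4 flattened 28x28 images
h1, z, x_hat = forward(x)

# One plausible reading of the abstract's compensation scheme: cluster on the
# latent code concatenated with the skip-connected features (an assumption,
# not the paper's exact method).
feat = np.concatenate([z, h1], axis=1)
```

The skip connection shortens the gradient path to the encoder, which typically improves reconstruction; the abstract's finding is that this does not automatically make the latent code `z` better for clustering.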
Keywords
Skip Connection; Autoencoder; Clustering; Super-resolution
References
1 C. Ding and X. He, "K-means clustering via principal component analysis," ICML, 2004.
2 R. Raskin and H. Terry, "A principal components analysis of the Narcissistic Personality Inventory and further evidence of its construct validity," Journal of Personality and Social Psychology, Vol.54, pp.890-902, 1988.
3 L. van der Maaten and G. Hinton, "Visualizing data using t-SNE," Journal of Machine Learning Research, Vol.9, 2008.
4 G. E. Hinton and S. T. Roweis, "Stochastic neighbor embedding," In Advances in Neural Information Processing Systems, Vol.15, pp.833-840, Cambridge, MA, USA, 2002. The MIT Press.
5 X. Peng, J. Feng, J. Lu, W. Y. Yau, and Z. Yi, "Cascade subspace clustering," In: AAAI Conference on Artificial Intelligence (AAAI), pp.2478-2484, 2017.
6 X. Peng, S. Xiao, J. Feng, W.Y. Yau, and Z. Yi, "Deep subspace clustering with sparsity prior," In: International Joint Conference on Artificial Intelligence (IJCAI), 2016.
7 J. Xie, R. Girshick, and A. Farhadi, "Unsupervised deep embedding for clustering analysis," In: International Conference on Machine Learning (ICML), 2016.
8 H. Liu, R. Xiong, J. Zhang, and W. Gao, "Image denoising via adaptive soft-thresholding based on non-local samples," in 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp.484-492, 2015.
9 F. Chen, L. Zhang, and H. Yu, "External patch prior guided internal clustering for image denoising," in 2015 IEEE International Conference on Computer Vision (ICCV), pp.603-611, 2015.
10 R. Timofte, V. D. Smet, and L. J. V. Gool, "A+: adjusted anchored neighborhood regression for fast super-resolution," in Proc. Asian Conf. Comp. Vis., pp.111-126, 2014.
11 J. Yang, Z. Lin, and S. Cohen, "Fast image super-resolution based on in-place example regression," in 2013 IEEE Conference on Computer Vision and Pattern Recognition, pp.1059-1066, 2013.
12 M. A. Kramer, "Autoassociative neural networks," Computers & Chemical Engineering, Vol.16, No.4, pp.313-328, 1992.
13 X.-J. Mao, C. Shen, and Y.-B. Yang, "Image restoration using convolutional autoencoders with symmetric skip connections," CoRR, abs/1606.08921, 2016.
14 J. Long, E. Shelhamer, and T. Darrell, "Fully convolutional networks for semantic segmentation," In Computer Vision and Pattern Recognition, 2015.
15 K. He, X. Zhang, S. Ren, and J. Sun, "Deep residual learning for image recognition," In Computer Vision and Pattern Recognition, 2016.
16 Jeonghyeon Lee, "Problems with Chinese Ideographs Search in Unicode and Solutions to Them," Informatization Policy, Vol.19, No.3, pp.50-63, 2012.
17 H. Xiao, K. Rasul, and R. Vollgraf, "Fashion-MNIST: a novel image dataset for benchmarking machine learning algorithms," arXiv preprint arXiv:1708.07747, 2017.
18 P. Vincent, H. Larochelle, Y. Bengio, and P. A. Manzagol, "Extracting and composing robust features with denoising autoencoders," In: International Conference on Machine Learning, pp.1096-1103, 2008.
19 K. He, G. Gkioxari, P. Dollar, and R. Girshick, "Mask R-CNN," arXiv:1703.06870, 2017.
20 G. Montufar, R. Pascanu, K. Cho, and Y. Bengio, "On the number of linear regions of deep neural networks," In Neural Information Processing Systems, 27, 2014.
21 P. Grother, "NIST special database 19 handprinted forms and characters database," National Institute of Standards and Technology, 1995.
22 S. Hochreiter, "The vanishing gradient problem during learning recurrent neural nets and problem solutions," International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, Vol.6, No.2, pp.107-116, 1998.
23 X. Guo, X. Liu, E. Zhu, and J. Yin, "Deep clustering with convolutional autoencoders," International Conference on Neural Information Processing, pp.373-382, 2017.
24 P. Ji, T. Zhang, H. Li, M. Salzmann, and I. Reid, "Deep subspace clustering networks," in Neural Information Processing Systems, QC, Canada, pp.24-33, Dec. 2017.
25 S. Yang, W. Zhu, and Y. Zhu, "Residual encoder-decoder network for deep subspace clustering," arXiv:1910.05569, 2019, [online] Available: https://arxiv.org/abs/1910.05569