http://dx.doi.org/10.4218/etrij.2017-0304

Projection spectral analysis: A unified approach to PCA and ICA with incremental learning  

Kang, Hoon (School of Electrical and Electronics Engineering, Chung-Ang University)
Lee, Hyun Su (School of Electrical and Electronics Engineering, Chung-Ang University)
Publication Information
ETRI Journal / v.40, no.5, 2018, pp. 634-642
Abstract
Projection spectral analysis is investigated and refined in this paper in order to unify principal component analysis and independent component analysis. Singular value decomposition and spectral theorems are applied to nonsymmetric correlation or covariance matrices with multiplicities or singularities, from which projections and nilpotents are obtained. The suggested approach therefore not only utilizes a sum-product of orthogonal projection operators and real, distinct eigenvalues (the squared singular values), but also reduces the dimension of the correlation or covariance matrix when there are multiple zero eigenvalues. Moreover, incremental learning strategies for projection spectral analysis are suggested to improve performance.
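To make the abstract's two ingredients concrete, the following is a minimal NumPy sketch, not the authors' projection spectral analysis algorithm: it incrementally estimates a sample covariance matrix one observation at a time (a Welford-style update, standing in for the paper's incremental learning strategies), then decomposes it into a sum-product of real eigenvalues and rank-one orthogonal projection operators, C = sum_i lambda_i v_i v_i^T, and discards projections with (near-)zero eigenvalues. All variable names and the toy data are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    # Toy data: 500 correlated 4-dimensional samples (illustrative only).
    samples = rng.standard_normal((500, 4)) @ rng.standard_normal((4, 4))

    # Incremental (one-sample-at-a-time) covariance estimate, Welford-style.
    mean = np.zeros(4)
    M2 = np.zeros((4, 4))            # running co-moment matrix
    for n, x in enumerate(samples, start=1):
        delta = x - mean
        mean += delta / n
        M2 += np.outer(delta, x - mean)
    C = M2 / (len(samples) - 1)      # unbiased sample covariance

    # Spectral decomposition: real eigenvalues, orthonormal eigenvectors.
    eigvals, eigvecs = np.linalg.eigh(C)
    # One rank-one orthogonal projection operator per eigenvector.
    projections = [np.outer(v, v) for v in eigvecs.T]
    # Reconstruct C as a sum-product of eigenvalues and projections.
    C_rebuilt = sum(lam * P for lam, P in zip(eigvals, projections))
    assert np.allclose(C, C_rebuilt)

    # Dimension reduction: drop projections with (near-)zero eigenvalues.
    keep = eigvals > 1e-10 * eigvals.max()
    print("retained components:", keep.sum())

Because the projections are mutually orthogonal and sum to the identity, dropping the zero-eigenvalue terms leaves an exact decomposition on the retained subspace, which is the dimension-reduction effect the abstract describes.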
Keywords
independent component analysis; machine learning; neural network; principal component analysis; projection spectral analysis; singular value decomposition; spectral theorem