http://dx.doi.org/10.7465/jkdi.2012.23.5.1027

An accelerated Levenberg-Marquardt algorithm for feedforward network  

Kwak, Young-Tae (Department of IT Engineering, Chonbuk National University)
Publication Information
Journal of the Korean Data and Information Science Society, v.23, no.5, 2012, pp. 1027-1035
Abstract
This paper proposes a new Levenberg-Marquardt algorithm that is accelerated by adjusting the Jacobian matrix and the quasi-Hessian matrix. The proposed method partitions the Jacobian matrix into block matrices and uses the inverse of a partitioned matrix to compute the inverse of the quasi-Hessian matrix. This avoids expensive operations and saves memory when inverting the quasi-Hessian matrix, thereby shortening training time and speeding convergence. In experiments on a large application, the proposed method reduced training time by about 20% compared with other algorithms.
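The abstract only outlines the blockwise-inversion idea, so the sketch below is illustrative rather than the paper's actual implementation. It assumes the standard partitioned-matrix (Schur complement) inverse identity is the mechanism used to invert the quasi-Hessian H = J^T J + mu*I in a Levenberg-Marquardt weight update; the names blockwise_inverse, k, mu, J, and e are placeholders, not taken from the paper.

import numpy as np

def blockwise_inverse(H, k):
    # Invert a symmetric matrix H partitioned as [[A, B], [B^T, D]],
    # where A is the leading k-by-k block, using the standard
    # partitioned-matrix (Schur complement) inverse identity.
    A, B, D = H[:k, :k], H[:k, k:], H[k:, k:]
    A_inv = np.linalg.inv(A)
    S = D - B.T @ A_inv @ B          # Schur complement of A in H
    S_inv = np.linalg.inv(S)
    top_left = A_inv + A_inv @ B @ S_inv @ B.T @ A_inv
    top_right = -A_inv @ B @ S_inv
    return np.block([[top_left, top_right],
                     [top_right.T, S_inv]])

# One damped Gauss-Newton (Levenberg-Marquardt) weight update computed with
# the blockwise inverse instead of inverting the full quasi-Hessian at once.
rng = np.random.default_rng(0)
J = rng.standard_normal((200, 30))   # Jacobian of the residuals w.r.t. the weights (illustrative sizes)
e = rng.standard_normal(200)         # residual (error) vector
mu = 0.01                            # damping factor

H = J.T @ J + mu * np.eye(J.shape[1])            # quasi-Hessian approximation
delta_w = -blockwise_inverse(H, 10) @ (J.T @ e)  # LM weight increment

The point of such a partition is that only the smaller blocks A and S ever need to be inverted and stored, which is consistent with the memory and computation savings the abstract claims.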
Keywords
Error backpropagation; Levenberg-Marquardt algorithm; multilayer perceptrons
Citations & Related Records
Times Cited By KSCI: 1
1 Setiono, R. and Hui, L. C. K. (1995). Use of a quasi-Newton method in a feedforward neural network construction algorithm. IEEE Transactions on Neural Networks, 6, 273-277.
2 Vogl, T., Mangis, J., Rigler, A., Zink, W. and Alkon, D. (1988). Accelerating the convergence of the backpropagation method. Biological Cybernetics, 59, 257-263.
3 Wilamowski, B. M. and Yu, H. (2010). Improved computation for Levenberg-Marquardt training. IEEE Transactions on Neural Networks, 21, 930-937.
4 Yu, X.-H., Chen, G.-A. and Cheng, S.-X. (1995). Dynamic learning rate optimization of the backpropagation algorithm. IEEE Transactions on Neural Networks, 6, 669-677.
5 Buntine, W. L. and Weigend, A. S. (1994). Computing second derivatives in feed-forward networks: A review. IEEE Transactions on Neural Networks, 5, 480-488.
6 Chan, L.-W. and Szeto, C.-C. (1999). Training recurrent network with block-diagonal approximated Levenberg-Marquardt algorithm. In International Joint Conference on Neural Networks, 1521-1526.
7 Charalambous, C. (1992). Conjugate gradient algorithm for efficient training of artificial neural networks. IEEE Proceedings, 139, 301-310.
8 Hagan, M. T. and Menhaj, M. B. (1994). Training feedforward networks with the Marquardt algorithm. IEEE Transactions on Neural Networks, 5, 989-993.
9 Hull, J. J. (1994). A database for handwritten text recognition research. IEEE Transactions on Pattern Analysis and Machine Intelligence, 16, 550-554.
10 Lera, G. and Pinzolas, M. (2002). Neighborhood based Levenberg-Marquardt algorithm for neural network training. IEEE Transactions on Neural Networks, 13, 1200-1203.
11 Oh, S.-H. and Lee, Y. (1995). A modified error function to improve the error back-propagation algorithm for multi-layer perceptrons. ETRI Journal, 17, 11-22.
12 Lippmann, R. (1987). An introduction to computing with neural nets. IEEE ASSP Magazine, 4, 4-22.
13 Na, M. W. and Kwon, Y. M. (2010). Alternative optimization procedure for parameter design using neural network without SN. Journal of the Korean Data & Information Science Society, 21, 211-218.
14 Oh, K. J., Kim, T. Y., Jung, K. and Kim, C. (2011). Stock market stability index via linear and neural network autoregressive model. Journal of the Korean Data & Information Science Society, 22, 335-351.
15 Saad, Y. (2003). Iterative methods for sparse linear systems. SIAM, Philadelphia.