A NEW CLASS OF NONLINEAR CONJUGATE GRADIENT METHOD FOR UNCONSTRAINED OPTIMIZATION MODELS AND ITS APPLICATION IN PORTFOLIO SELECTION

  • Malik, Maulana (Faculty of Informatics and Computing, Universiti Sultan Zainal Abidin) ;
  • Sulaiman, Ibrahim Mohammed (School of Quantitative Sciences, College of Arts and Sciences, Universiti Utara Malaysia) ;
  • Mamat, Mustafa (Faculty of Informatics and Computing, Universiti Sultan Zainal Abidin) ;
  • Abas, Siti Sabariah (Faculty of Informatics and Computing, Universiti Sultan Zainal Abidin) ;
  • Sukono, Sukono (Department of Mathematics, Universitas Padjadjaran)
  • Received : 2021.03.23
  • Accepted : 2021.05.09
  • Published : 2021.12.15

Abstract

In this paper, we propose a new conjugate gradient method for solving unconstrained optimization models. Under both exact and strong Wolfe line searches, the proposed method satisfies the sufficient descent condition and possesses global convergence properties. Numerical results show that the proposed method is efficient on the given test functions in small, medium, and large dimensions. In addition, the proposed method is applied to a practical problem in portfolio selection.
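
The abstract describes a nonlinear conjugate gradient (CG) iteration paired with a strong Wolfe line search. As a rough illustration only, and not the authors' method (the paper's new CG coefficient is not stated in the abstract), the following Python sketch implements a generic nonlinear CG loop using SciPy's strong-Wolfe line search, with the classical Fletcher-Reeves coefficient standing in as a placeholder for the proposed formula.

```python
# Minimal sketch of a generic nonlinear CG loop with a strong Wolfe line search.
# NOTE: the CG coefficient below is the classical Fletcher-Reeves choice, used
# only as a placeholder; the paper proposes a different (new) coefficient.
import numpy as np
from scipy.optimize import line_search

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Step size satisfying the strong Wolfe conditions
        alpha = line_search(f, grad, x, d)[0]
        if alpha is None:           # line search failed; restart with -g
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Placeholder CG coefficient (Fletcher-Reeves)
        beta = (g_new @ g_new) / (g @ g)
        d = -g_new + beta * d       # next search direction
        x, g = x_new, g_new
    return x

# Usage example on a simple quadratic test function
if __name__ == "__main__":
    f = lambda x: 0.5 * (x @ x)
    grad = lambda x: x
    print(nonlinear_cg(f, grad, np.array([3.0, -4.0])))
```

The same loop structure can, in principle, be pointed at a smooth portfolio-risk objective, which is the kind of application the paper reports; only the CG coefficient and line search parameters distinguish the proposed method from this generic template.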

Keywords

References

  1. A.B. Abubakar, P. Kumam and A.M. Awwal, Global convergence via descent modified three-term conjugate gradient projection algorithm with applications to signal recovery, Results Appl. Math., 4 (2019), Article ID 100069. https://doi.org/10.1016/j.rinam.2019.100069
  2. A.B. Abubakar, P. Kumam, M. Malik, P. Chaipunya, and A.H. Ibrahim, A hybrid FR-DY conjugate gradient algorithm for unconstrained optimization with application in portfolio selection, AIMS Mathematics, 6(6) (2021), 6506-6527. https://doi.org/10.3934/math.2021383
  3. M. Al-Baali, Descent property and global convergence of the Fletcher-Reeves method with inexact line search, IMA J. Numer. Anal., 5(1) (1985), 121-124. https://doi.org/10.1093/imanum/5.1.121
  4. N. Andrei, An Unconstrained Optimization Test Functions Collection, Adv. Model. Optim., 10(1) (2008), 147-161.
  5. L. Armijo, Minimization of functions having Lipschitz continuous first partial derivatives, Pacific J. Math., 16(1) (1966), 1-3. https://doi.org/10.2140/pjm.1966.16.1
  6. J. Cao and J. Wu, A conjugate gradient algorithm and its applications in image restoration, Appl. Numer. Math., 152 (2020), 243-252. https://doi.org/10.1016/j.apnum.2019.12.002
  7. Y.H. Dai and Y. Yuan, A nonlinear conjugate gradient method with a strong global convergence property, SIAM J. Optim., 10(1) (1999), 177-182. https://doi.org/10.1137/S1052623497318992
  8. E.D. Dolan and J.J. Moré, Benchmarking optimization software with performance profiles, Math. Progr., 91(2) (2002), 201-213. https://doi.org/10.1007/s101070100263
  9. R. Fletcher and C.M. Reeves, Function minimization by conjugate gradients, The Comput. J., 7(2) (1964), 149-154. https://doi.org/10.1093/comjnl/7.2.149
  10. R. Fletcher, Practical Methods of Optimization, John Wiley & Sons, New York, USA, 1987.
  11. J.C. Gilbert and J. Nocedal, Global convergence properties of conjugate gradient methods for optimization, SIAM J. Optim., 2(1) (1992), 21-42. https://doi.org/10.1137/0802003
  12. M. Jamil and X.S. Yang, A literature survey of benchmark functions for global optimisation problems, Inter. J. Math. Model. Numer. Optim., 4(2) (2013), 150-194. https://doi.org/10.1504/IJMMNO.2013.055204
  13. J.K. Liu, Y.M. Feng and L.M. Zou, A spectral conjugate gradient method for solving large-scale unconstrained optimization, Comput. Math. Appl., 77(3) (2019), 731-739. https://doi.org/10.1016/j.camwa.2018.10.002
  14. M. Malik, S.S. Abas, M. Mamat, Sukono and I.S. Mohammed, A new hybrid conjugate gradient method with global convergence properties, Inter. J. Advan. Sci. Techn., 29(5) (2020), 199-210.
  15. M. Malik, M. Mamat, S.S. Abas and Sukono, Convergence analysis of a new coefficient conjugate gradient method under exact line search, Inter. J. Advan. Sci. Techn., 29(5) (2020), 187-198.
  16. M. Malik, M. Mamat, S.S. Abas, I.M. Sulaiman and Sukono, A new coefficient of the conjugate gradient method with the sufficient descent condition and global convergence properties, Engineering Letters, 28(3) (2020), 704-714.
  17. M. Malik, M. Mamat, S.S. Abas, I.M. Sulaiman and Sukono, A new spectral conjugate gradient method with descent condition and global convergence property for unconstrained optimization, J. Math. Comput. Sci., 10(5) (2020), 2053-2069.
  18. M. Malik, M. Mamat, S.S. Abas, I.M. Sulaiman and Sukono, A new modification of NPRP conjugate gradient method for unconstrained optimization, Advan. Math.: Scientific J., 9(7) (2020), 4955-4970. https://doi.org/10.37418/amsj.9.7.61
  19. M. Malik, M. Mamat, S.S. Abas, I.M. Sulaiman and Sukono, Performance analysis of new spectral and hybrid conjugate gradient methods for solving unconstrained optimization problems, IAENG Inter. J. Comput. Sci., 48(1) (2021), 66-79.
  20. P. Mtagulwa and P. Kaelo, An efficient modified PRP-FR hybrid conjugate gradient method for solving unconstrained optimization problems, Appl. Numer. Math., 145 (2019), 111-120. https://doi.org/10.1016/j.apnum.2019.06.003
  21. J. Nocedal and S.J. Wright, Numerical Optimization, Springer, New York, 2000.
  22. E. Polak, Algorithms and Consistent Approximations, Springer, Berlin, 1997.
  23. E. Polak and G. Ribière, Note sur la convergence de méthodes de directions conjuguées, ESAIM: Math. Model. Numer. Anal. - Modélisation Mathématique et Analyse Numérique, 3 (1969), 35-43.
  24. R. Pytlak, Conjugate gradient algorithms in nonconvex optimization, vol. 89, Springer Science & Business Media, 2008.
  25. B.T. Polyak, The conjugate gradient method in extremal problems, USSR Comput. Math. Math. Phy., 9(4) (1969), 94-112. https://doi.org/10.1016/0041-5553(69)90035-4
  26. M. Rivaie, M. Mamat and A. Abashar, A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches, Appl. Math. Comput., 268 (2015), 1152-1163. https://doi.org/10.1016/j.amc.2015.07.019
  27. M. Rivaie, M. Mamat, L.W. June and I. Mohd, A new class of nonlinear conjugate gradient coefficients with global convergence properties, Appl. Math. Comput., 218(22) (2012), 11323-11332. https://doi.org/10.1016/j.amc.2012.05.030
  28. S. Roman, Introduction to the mathematics of finance: from risk management to options pricing, Springer Science & Business Media, 2004.
  29. Z.J. Shi and J. Shen, Convergence of PRP method with new nonmonotone line search, Appl. Math. Comput., 181(1) (2006), 423-431. https://doi.org/10.1016/j.amc.2005.12.064
  30. I.M. Sulaiman, M. Mamat, M.Y. Waziri, U.A. Yakubu and M. Malik, The performance analysis of a new modification of conjugate gradient parameter for unconstrained optimization models, Math. Statistics, 9(1) (2021), 16-23. https://doi.org/10.13189/ms.2021.090103
  31. Z. Wang, P. Li, X. Li and H. Pham, A modified three-term type CD conjugate gradient algorithm for unconstrained optimization problems, Math. Prob. Engin., 2020 (2020), Article ID 4381515.
  32. M.Y. Waziri, K. Ahmed and J. Sabi'u, A family of Hager-Zhang conjugate gradient methods for system of monotone nonlinear equations, Appl. Math. Comput., 361 (2019), 645-660. https://doi.org/10.1016/j.amc.2019.06.012
  33. Z. Wei, S. Yao and L. Liu, The convergence properties of some new conjugate gradient methods, Appl. Math. Comput., 183(2) (2006), 1341-1350. https://doi.org/10.1016/j.amc.2006.05.150
  34. P. Wolfe, Convergence conditions for ascent methods, SIAM Review, 11(2) (1969), 226-235. https://doi.org/10.1137/1011036
  35. P. Wolfe, Convergence conditions for ascent methods. II: Some corrections, SIAM Review, 13(2) (1971), 185-188. https://doi.org/10.1137/1013035
  36. K. Yang, J. Geng-Hui, Q. Qu, H.F. Peng and X.W. Gao, A new modified conjugate gradient method to identify thermal conductivity of transient non-homogeneous problems based on radial integration boundary element method, Inter. J. Heat and Mass Transfer, 133 (2019), 669-676. https://doi.org/10.1016/j.ijheatmasstransfer.2018.12.145
  37. S. Yao, Q. Feng, L. Li and J. Xu, A class of globally convergent three-term Dai-Liao conjugate gradient methods, Appl. Numer. Math., 151 (2020), 354-366. https://doi.org/10.1016/j.apnum.2019.12.026
  38. O.O.O. Yousif, The convergence properties of RMIL+ conjugate gradient method under the strong Wolfe line search, Appl. Math. Comput., 367 (2020), Article ID 124777. https://doi.org/10.1016/j.amc.2019.124777
  39. G. Yuan, T. Li and W. Hu, A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems, Appl. Numer. Math., 147 (2020), 129-141. https://doi.org/10.1016/j.apnum.2019.08.022
  40. Y. Yuan and W.Y. Sun, Optimization Theory and Methods, Science Press, Beijing, 1997.
  41. L. Zhang, An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation, Appl. Math. Comput., 215(6) (2009), 2269-2274. https://doi.org/10.1016/j.amc.2009.08.016
  42. Z. Zhu, D. Zhang and S. Wang, Two modified DY conjugate gradient methods for unconstrained optimization problems, Appl. Math. Comput., 373 (2020), Article ID 125004. https://doi.org/10.1016/j.amc.2019.125004
  43. G. Zoutendijk, Nonlinear programming, computational methods, in: Integer and Nonlinear Programming, North-Holland, Amsterdam, 1970, 37-86.