• Title/Summary/Keyword: Nonlinear conjugate gradient method


A NONLINEAR CONJUGATE GRADIENT METHOD AND ITS GLOBAL CONVERGENCE ANALYSIS

  • CHU, AJIE;SU, YIXIAO;DU, SHOUQIANG
    • Journal of applied mathematics & informatics
    • /
    • v.34 no.1_2
    • /
    • pp.157-165
    • /
    • 2016
  • In this paper, we develop a new hybrid conjugate gradient method for solving unconstrained optimization problems. Under mild assumptions, the method satisfies the sufficient descent property. Global convergence is also established under both a Wolfe-type line search and the general Wolfe line search. Numerical results show that the method is efficient.
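The abstract does not give the paper's hybridization formula, but a minimal sketch of a nonlinear conjugate gradient loop with a Wolfe line search, using a common hybrid PRP/FR beta as a stand-in, looks like this (the Rosenbrock test function and all parameter values are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import line_search  # strong Wolfe line search

def rosenbrock(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=2000):
    """Nonlinear CG with a generic hybrid PRP/FR beta (an assumption:
    the paper's own beta formula is not stated in the abstract)."""
    x = x0.copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Wolfe line search along the current direction d
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:          # line search failed: restart with steepest descent
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0] or 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        # hybrid beta: PRP clipped into [0, FR] keeps the descent property
        beta_fr = (g_new @ g_new) / (g @ g)
        beta_prp = (g_new @ (g_new - g)) / (g @ g)
        beta = max(0.0, min(beta_prp, beta_fr))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

x_star = hybrid_cg(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0]))
```

The clipping `max(0, min(PRP, FR))` is one standard way to combine the efficiency of PRP with the convergence guarantees of FR under Wolfe line searches.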

A Study on the Application of Conjugate Gradient Method in Nonlinear Magnetic Field Analysis by FEM

  • 임달호;신흥교
    • The Transactions of the Korean Institute of Electrical Engineers
    • /
    • v.39 no.1
    • /
    • pp.22-28
    • /
    • 1990
  • This paper studies the reduction of computation time in nonlinear magnetic field analysis by the finite element method combined with the Newton-Raphson method. For this purpose, the linearized system at each Newton-Raphson step is solved by the conjugate gradient method, which is known to be applicable only to symmetric positive definite matrix equations. Although we cannot prove mathematically that the system Jacobian is positive definite, no diverging case occurred when the method was applied. The computation time was reduced by 25-55% and 15-45% in comparison with the direct method and the successive over-relaxation method, respectively. This demonstrates the utility of the conjugate gradient method.
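The inner solver described above is the classical linear conjugate gradient method for symmetric positive definite systems. A self-contained sketch, with a hypothetical 2×2 system standing in for the FEM Jacobian:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive definite A.
    This is the kind of inner solve that replaces a direct
    factorization at each Newton-Raphson step."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)      # exact minimizer along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p  # A-conjugate direction update
        rs = rs_new
    return x

# toy SPD system (an illustrative assumption, not the paper's FEM matrix)
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic the method terminates in at most n iterations, which is why it can undercut both direct factorization and SOR on large sparse FEM systems.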


A LOGARITHMIC CONJUGATE GRADIENT METHOD INVARIANT TO NONLINEAR SCALING

  • Moghrabi, I.A.
    • Journal of the Korean Society for Industrial and Applied Mathematics
    • /
    • v.8 no.2
    • /
    • pp.15-21
    • /
    • 2004
  • A Conjugate Gradient (CG) method is proposed for unconstrained optimization which is invariant to a nonlinear scaling of a strictly convex quadratic function. The technique has the same properties as the classical CG method when applied to a quadratic function. The algorithm derived here is based on a logarithmic model and is compared to the standard CG method of Fletcher and Reeves [3]. Numerical results are encouraging and indicate that nonlinear scaling is promising and deserves further investigation.


Conjugate finite-step length method for efficient and robust structural reliability analysis

  • Keshtegar, Behrooz
    • Structural Engineering and Mechanics
    • /
    • v.65 no.4
    • /
    • pp.415-422
    • /
    • 2018
  • The "Conjugate Finite-Step Length" (CFSL) algorithm is proposed to improve the efficiency and robustness of the first-order reliability method (FORM) for reliability analysis of highly nonlinear problems. The conjugate FORM-based CFSL is formulated using an adaptive conjugate search direction based on a finite step size with a simple adjusting condition, the gradient vector of the performance function, and previous iterative results, including the conjugate gradient vector and the converged point. The efficiency and robustness of the CFSL algorithm are compared with the HL-RF and "Finite-Step-Length" (FSL) algorithms on several nonlinear mathematical and structural/mechanical examples. Numerical results illustrate that the CFSL algorithm is both more robust and more efficient than HL-RF, and that it is as robust as FSL for structural reliability analysis while being more efficient.
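For context, the HL-RF baseline mentioned above is the classical fixed-point iteration for the FORM most probable point (MPP). A minimal sketch, with a hypothetical mildly nonlinear limit-state function in standard normal space (this is the comparison baseline, not the paper's CFSL algorithm):

```python
import numpy as np

def g_limit(u):
    """Hypothetical limit-state function (illustration only)."""
    return 3.0 - u[1] - 0.2 * (u[0] - 1.0) ** 2

def g_grad(u):
    return np.array([-0.4 * (u[0] - 1.0), -1.0])

def hl_rf(g, grad, n=2, max_iter=100, tol=1e-10):
    """Classical HL-RF iteration for the FORM most probable point:
    u_{k+1} = [grad(u_k)^T u_k - g(u_k)] grad(u_k) / ||grad(u_k)||^2."""
    u = np.zeros(n)
    for _ in range(max_iter):
        gu, dg = g(u), grad(u)
        u_new = (dg @ u - gu) * dg / (dg @ dg)
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    return u

u_star = hl_rf(g_limit, g_grad)
beta = np.linalg.norm(u_star)   # reliability index
```

HL-RF can oscillate or diverge for highly nonlinear limit states, which is exactly the failure mode that conjugate and finite-step-length modifications such as FSL and CFSL are designed to address.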

A NEW CONJUGATE GRADIENT MINIMIZATION METHOD BASED ON EXTENDED QUADRATIC FUNCTIONS

  • Moghrabi, Issam.A.R.
    • Journal of the Korean Society for Industrial and Applied Mathematics
    • /
    • v.8 no.2
    • /
    • pp.7-13
    • /
    • 2004
  • A Conjugate Gradient (CG) algorithm for unconstrained minimization is proposed which is invariant to a nonlinear scaling of a strictly convex quadratic function and which generates mutually conjugate directions for extended quadratic functions. It is derived for inexact line searches and is designed for the minimization of general nonlinear functions. It compares favorably in numerical tests with the original Dixon algorithm on which the new algorithm is based.


A NEW CLASS OF NONLINEAR CONJUGATE GRADIENT METHOD FOR UNCONSTRAINED OPTIMIZATION MODELS AND ITS APPLICATION IN PORTFOLIO SELECTION

  • Malik, Maulana;Sulaiman, Ibrahim Mohammed;Mamat, Mustafa;Abas, Siti Sabariah;Sukono, Sukono
    • Nonlinear Functional Analysis and Applications
    • /
    • v.26 no.4
    • /
    • pp.811-837
    • /
    • 2021
  • In this paper, we propose a new conjugate gradient method for solving unconstrained optimization models. By using exact and strong Wolfe line searches, the proposed method possesses the sufficient descent condition and global convergence properties. Numerical results show that the proposed method is efficient at small, medium, and large dimensions for the given test functions. In addition, the proposed method was applied to solve practical application problems in portfolio selection.

AN AFFINE SCALING INTERIOR ALGORITHM VIA CONJUGATE GRADIENT AND LANCZOS METHODS FOR BOUND-CONSTRAINED NONLINEAR OPTIMIZATION

  • Jia, Chunxia;Zhu, Detong
    • Journal of applied mathematics & informatics
    • /
    • v.29 no.1_2
    • /
    • pp.173-190
    • /
    • 2011
  • In this paper, we construct a new affine scaling interior algorithm using affine scaling conjugate gradient and Lanczos methods for bound-constrained nonlinear optimization. The iterative direction is obtained by solving a quadratic model via the affine scaling conjugate gradient and Lanczos methods. Using a backtracking line search technique, we find an acceptable trial step length along this direction that keeps the iterate strictly feasible and makes the objective function nonmonotonically decreasing. Global convergence and a local superlinear convergence rate of the proposed algorithm are established under reasonable conditions. Finally, we present numerical results to illustrate the effectiveness of the proposed algorithm.

GLOBAL CONVERGENCE OF AN EFFICIENT HYBRID CONJUGATE GRADIENT METHOD FOR UNCONSTRAINED OPTIMIZATION

  • Liu, Jinkui;Du, Xianglin
    • Bulletin of the Korean Mathematical Society
    • /
    • v.50 no.1
    • /
    • pp.73-81
    • /
    • 2013
  • In this paper, an efficient hybrid nonlinear conjugate gradient method is proposed to solve general unconstrained optimization problems on the basis of the CD method [2] and the DY method [5]. The hybrid method possesses the following property: the sufficient descent condition holds without any line search. Under the Wolfe line search conditions, we prove the global convergence of the hybrid method for general nonconvex functions. The numerical results show that the hybrid method is especially efficient for the given test problems, and it can be widely used in scientific and engineering computation.

Solving a Matrix Polynomial by Conjugate Gradient Methods

  • Ko, Hyun-Ji;Kim, Hyun-Min
    • Journal of the Korean Society for Industrial and Applied Mathematics
    • /
    • v.11 no.4
    • /
    • pp.39-46
    • /
    • 2007
  • One of the well-known and much-studied nonlinear matrix equations is the matrix polynomial, which has the form G(X) = A_0 X^m + A_1 X^{m-1} + ... + A_m, where A_0, A_1, ..., A_m and X are n×n real matrices. We show how minimization methods can be used to solve the matrix polynomial G(X) and give some numerical experiments. We also compare the Polak-Ribière and Fletcher-Reeves versions of the conjugate gradient method.
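The approach above can be sketched by minimizing the residual f(X) = ½||G(X)||_F² with nonlinear CG under either beta rule. A minimal example for the quadratic case m = 2 (the random test problem, starting point, and Armijo line search are illustrative assumptions, not the paper's setup):

```python
import numpy as np

def make_problem(n=3, seed=0):
    """G(X) = A0 X^2 + A1 X + A2 with a planted solution Xs:
    choose A0, A1, Xs at random and set A2 = -(A0 Xs^2 + A1 Xs)."""
    rng = np.random.default_rng(seed)
    A0, A1, Xs = rng.standard_normal((3, n, n))
    A2 = -(A0 @ Xs @ Xs + A1 @ Xs)
    return A0, A1, A2, Xs

def residual(X, A0, A1, A2):
    return A0 @ X @ X + A1 @ X + A2

def f(X, A0, A1, A2):
    G = residual(X, A0, A1, A2)
    return 0.5 * np.sum(G * G)

def grad(X, A0, A1, A2):
    # gradient of 0.5||G(X)||_F^2 from the differential
    # dG = A0 (dX X + X dX) + A1 dX
    G = residual(X, A0, A1, A2)
    return A0.T @ G @ X.T + X.T @ A0.T @ G + A1.T @ G

def cg_minimize(A0, A1, A2, X0, beta_rule="PR", iters=5000):
    X = X0.copy()
    g = grad(X, A0, A1, A2)
    D = -g
    for _ in range(iters):
        if np.linalg.norm(g) < 1e-10:
            break
        gd = np.sum(g * D)
        if gd >= 0:                # safeguard: restart with steepest descent
            D, gd = -g, -np.sum(g * g)
        # backtracking (Armijo) line search
        t, fX = 1.0, f(X, A0, A1, A2)
        while f(X + t * D, A0, A1, A2) > fX + 1e-4 * t * gd and t > 1e-16:
            t *= 0.5
        X_new = X + t * D
        g_new = grad(X_new, A0, A1, A2)
        if beta_rule == "FR":      # Fletcher-Reeves
            beta = np.sum(g_new * g_new) / np.sum(g * g)
        else:                      # Polak-Ribiere (nonnegative variant)
            beta = max(0.0, np.sum(g_new * (g_new - g)) / np.sum(g * g))
        D = -g_new + beta * D
        X, g = X_new, g_new
    return X

A0, A1, A2, Xs = make_problem()
X0 = Xs + 0.1 * np.ones_like(Xs)   # start near the planted solution
X_fr = cg_minimize(A0, A1, A2, X0, beta_rule="FR")
X_pr = cg_minimize(A0, A1, A2, X0, beta_rule="PR")
```

Since f is nonconvex in X, both variants are only guaranteed to reach a stationary point; starting near a root is what makes the comparison meaningful here.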


Iris Recognition using Multi-Resolution Frequency Analysis and Levenberg-Marquardt Back-Propagation

  • Jeong Yu-Jeong;Choi Gwang-Mi
    • Journal of information and communication convergence engineering
    • /
    • v.2 no.3
    • /
    • pp.177-181
    • /
    • 2004
  • In this paper, we propose an iris recognition system with an excellent recognition rate and high confidence as an alternative biometric technique that overcomes the limits of existing methods of individual discrimination. For its implementation, we extracted coefficient feature values with the wavelet transform widely used in signal processing, and we used a neural network to measure the recognition rate. However, the Scaled Conjugate Gradient method, a nonlinear optimization method commonly used for neural networks, is not well suited to this optimization problem because of its slow convergence. We therefore enhance the recognition rate by using Levenberg-Marquardt back-propagation, which supplements the Scaled Conjugate Gradient method, in the implementation of the iris recognition system. We improve convergence speed, efficiency, and stability by properly adjusting the step size according to both the convergence rate of the solution and the variation rate of the variable vector.
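The Levenberg-Marquardt update underlying such training adaptively blends Gauss-Newton and gradient-descent steps. A generic sketch on a hypothetical exponential-fit least-squares problem (not the paper's iris network; the damping schedule and toy data are assumptions):

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, p0, max_iter=100, lam=1e-3):
    """Generic LM loop: the damping lam interpolates between Gauss-Newton
    (small lam) and gradient descent (large lam), adapted according to
    whether a trial step reduces the residual."""
    p = p0.copy()
    r = residual(p)
    cost = r @ r
    for _ in range(max_iter):
        J = jacobian(p)
        A = J.T @ J
        g = J.T @ r
        # Marquardt scaling: damp the diagonal of the normal equations
        step = np.linalg.solve(A + lam * np.diag(np.diag(A)), -g)
        r_new = residual(p + step)
        cost_new = r_new @ r_new
        if cost_new < cost:        # accept: move toward Gauss-Newton
            p, r, cost = p + step, r_new, cost_new
            lam = max(lam / 10.0, 1e-12)
        else:                      # reject: increase damping
            lam *= 10.0
        if np.linalg.norm(g) < 1e-10:
            break
    return p

# fit y = a * exp(b * t) to noiseless synthetic data (toy problem)
t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * t)

def res(p):
    return p[0] * np.exp(p[1] * t) - y

def jac(p):
    return np.column_stack([np.exp(p[1] * t),
                            p[0] * t * np.exp(p[1] * t)])

p_hat = levenberg_marquardt(res, jac, np.array([1.0, 0.0]))
```

For neural-network training, the residual vector is the per-sample output error and J is its Jacobian with respect to the weights; the same accept/reject damping logic applies.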