• Title/Abstract/Keywords: Local superlinear convergence


AFFINE INVARIANT LOCAL CONVERGENCE THEOREMS FOR INEXACT NEWTON-LIKE METHODS

  • Argyros, Ioannis K.
    • Journal of applied mathematics & informatics, Vol. 6, No. 2, pp. 393-406, 1999
  • Affine invariant sufficient conditions are given for two local convergence theorems involving inexact Newton-like methods. The first uses conditions on the first Fréchet derivative, whereas the second theorem employs hypotheses on the second. Radius-of-convergence as well as rate-of-convergence results are derived. Results involving superlinear convergence, known to be true for inexact Newton methods, are extended here. Moreover, we show that under hypotheses on the second Fréchet derivative our radius of convergence is larger than the corresponding one in [10]. This allows a wider choice for the initial guess. A numerical example is also provided to show that our radius of convergence is larger than the one in [10].
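
This abstract concerns inexact Newton-like methods, in which the Newton step is computed only approximately. As a minimal sketch of the general mechanism (not the paper's specific affine invariant conditions), the fragment below assumes a smooth map F with Jacobian J and uses an Eisenstat-Walker-style forcing term eta_k that shrinks with the residual, which is the classical route to local superlinear convergence; all names and tolerances are illustrative.

```python
import numpy as np
from scipy.sparse.linalg import gmres

def inexact_newton(F, J, x0, tol=1e-10, max_iter=50):
    """Inexact Newton: solve J(x_k) s_k = -F(x_k) only approximately,
    with residual norm <= eta_k * ||F(x_k)||. Driving the forcing term
    eta_k -> 0 is what yields local superlinear convergence."""
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        Fx = F(x)
        nFx = np.linalg.norm(Fx)
        if nFx < tol:
            break
        eta_k = min(0.5, nFx)                      # forcing term shrinks near the root
        s, _ = gmres(J(x), -Fx, atol=eta_k * nFx)  # inexact linear solve
        x = x + s
    return x

# Usage on a small nonlinear system F(x) = (x0^2 - 2, x0*x1 - 1):
F = lambda x: np.array([x[0]**2 - 2.0, x[0] * x[1] - 1.0])
J = lambda x: np.array([[2.0 * x[0], 0.0], [x[1], x[0]]])
print(inexact_newton(F, J, np.array([1.0, 1.0])))  # -> approx (1.4142, 0.7071)
```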

LOCAL CONVERGENCE THEOREMS FOR NEWTON METHODS

  • Argyros, Ioannis K.
    • Journal of applied mathematics & informatics, Vol. 8, No. 2, pp. 345-360, 2001
  • Affine invariant sufficient conditions are given for two local convergence theorems involving inexact Newton-like methods. The first uses conditions on the first Fréchet derivative, whereas the second theorem employs hypotheses on the m-th (m ≥ 2 an integer). Radius-of-convergence as well as rate-of-convergence results are derived. Results involving superlinear convergence, known to be true for inexact Newton methods, are extended here. Moreover, we show that under hypotheses on the m-th Fréchet derivative our radius of convergence can sometimes be larger than the corresponding one in [10]. This allows a wider choice for the initial guess. A numerical example is also provided to show that our radius of convergence is larger than the one in [10].
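
Since both of these abstracts emphasize rate-of-convergence results, one generic diagnostic is worth sketching: the local order p can be estimated from three successive errors, and values of p above 1 indicate superlinear convergence. The check below assumes the exact root is known, as in a constructed test problem; it is a standard numerical device, not taken from the paper.

```python
import numpy as np

def estimated_orders(errors):
    """Estimate the convergence order p from successive errors e_k via
    p ~ log(e_{k+1}/e_k) / log(e_k/e_{k-1}); p > 1 means superlinear."""
    e = np.asarray(errors, dtype=float)
    return np.log(e[2:] / e[1:-1]) / np.log(e[1:-1] / e[:-2])

# Newton's method on f(x) = x^3 - 2 from x0 = 1.5; the exact root is known,
# so the true error is observable and the estimated order should approach 2.
root, x, errs = 2.0 ** (1.0 / 3.0), 1.5, []
for _ in range(4):
    x -= (x**3 - 2.0) / (3.0 * x**2)    # Newton step
    errs.append(abs(x - root))
print(estimated_orders(errs))           # ~[1.99, 2.00]: quadratic, hence superlinear
```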

A TYPE OF MODIFIED BFGS ALGORITHM WITH ANY RANK DEFECTS AND THE LOCAL Q-SUPERLINEAR CONVERGENCE PROPERTIES

  • Ge Ren-Dong;Xia Zun-Quan;Qiang Guo
    • Journal of applied mathematics & informatics, Vol. 22, No. 1-2, pp. 193-208, 2006
  • A modified BFGS algorithm is presented for unconstrained optimization problems in which the Hessian matrix of the convex objective function is rank deficient at the minimizer. The main idea of the algorithm is first to add a modification term to the convex function to obtain an equivalent model, and then to simplify the model to obtain the modified BFGS algorithm. The superlinear convergence property of the algorithm is proved in this paper. Compared with the tensor algorithms presented by R. B. Schnabel (see [4], [5]), this method is more efficient for solving singular unconstrained optimization problems in terms of computational cost and complexity.
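
For orientation, the sketch below is the baseline BFGS iteration with an Armijo backtracking line search. The paper's modification term, which keeps the method effective when the Hessian is rank deficient at the minimizer, is not reproduced here; tolerances and the line-search constants are illustrative.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    """Baseline BFGS with Armijo backtracking. The paper's modified
    version adds a correction term to the objective so the update stays
    effective under a rank-deficient Hessian; that term is omitted."""
    n = x0.size
    H = np.eye(n)                          # inverse-Hessian approximation
    x, g = x0.astype(float).copy(), grad(x0)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                         # quasi-Newton search direction
        t, fx = 1.0, f(x)
        while f(x + t * p) > fx + 1e-4 * t * (g @ p) and t > 1e-12:
            t *= 0.5                       # Armijo backtracking
        s = t * p
        g_new = grad(x + s)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                     # curvature condition keeps H positive definite
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x + s, g_new
    return x
```

A convex quadratic with singular Hessian is exactly the regime where this plain update degrades and the paper's correction term is designed to help.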

A SMOOTHING NEWTON METHOD FOR NCP BASED ON A NEW CLASS OF SMOOTHING FUNCTIONS

  • Zhu, Jianguang;Hao, Binbin
    • Journal of applied mathematics & informatics, Vol. 32, No. 1-2, pp. 211-225, 2014
  • A new class of smoothing functions is introduced in this paper, which includes some important smoothing complementarity functions as special cases. Based on this new class of smoothing functions, we propose a smoothing Newton method. Our algorithm needs to solve only one linear system of equations per iteration. Without requiring nonemptiness and boundedness of the solution set, the proposed algorithm is proved to be globally convergent. Numerical results indicate that the smoothing Newton method based on the newly proposed class of smoothing functions with $\theta \in (0,1)$ has better numerical performance than those based on some other important smoothing functions, which also demonstrates that our algorithm is promising.
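
The paper's new class of smoothing functions (parameterized by θ) is defined in the paper itself; as a stand-in, the sketch below uses the classical smoothed Fischer-Burmeister function, one of the well-known smoothing complementarity functions such classes typically generalize, to show how an NCP is recast as a smooth square system.

```python
import numpy as np

def phi_mu(a, b, mu):
    """Smoothed Fischer-Burmeister function
    phi_mu(a, b) = a + b - sqrt(a^2 + b^2 + 2*mu^2).
    At mu = 0 it reduces to the FB NCP-function, whose zeros encode
    complementarity: phi(a, b) = 0  <=>  a >= 0, b >= 0, a*b = 0."""
    return a + b - np.sqrt(a * a + b * b + 2.0 * mu * mu)

def ncp_system(x, F, mu):
    """Componentwise reformulation: H_mu(x) = 0 turns the NCP
    'x >= 0, F(x) >= 0, x^T F(x) = 0' into a smooth square system,
    so each Newton step amounts to one linear solve."""
    return phi_mu(x, F(x), mu)
```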

ANALYSIS OF SMOOTHING NEWTON-TYPE METHOD FOR NONLINEAR COMPLEMENTARITY PROBLEMS

  • Zheng, Xiuyun
    • Journal of applied mathematics & informatics, Vol. 29, No. 5-6, pp. 1511-1523, 2011
  • In this paper, we consider a smoothing Newton method for nonlinear complementarity problems with a $P_0$-function. The proposed algorithm is based on a new smoothing function and needs to solve only one linear system of equations and perform one line search per iteration. Under the condition that the solution set is nonempty and bounded, the proposed algorithm is proved to be globally convergent. Furthermore, local superlinear (quadratic) convergence is established under suitable conditions. Preliminary numerical results show that the proposed algorithm is very promising.
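
The structure this abstract describes, one linear solve plus one line search per iteration with the smoothing parameter driven to zero, can be sketched as follows. The Jacobian comes from differentiating the smoothed Fischer-Burmeister reformulation shown earlier; the paper's own smoothing function and its treatment of mu as an extra unknown are not reproduced, and the halving rule for mu below is a simplification.

```python
import numpy as np

def smoothing_newton(F, JF, x0, mu0=1.0, tol=1e-8, max_iter=100):
    """Per iteration: one linear solve for the Newton direction and one
    backtracking line search on the merit ||H_mu(x)||^2, while the
    smoothing parameter mu is driven toward zero."""
    x, mu = x0.astype(float).copy(), mu0

    def H(z):  # smoothed FB reformulation, componentwise (reads current mu)
        Fz = F(z)
        return z + Fz - np.sqrt(z**2 + Fz**2 + 2.0 * mu**2)

    for _ in range(max_iter):
        Hx = H(x)
        if np.linalg.norm(Hx) < tol:
            break
        Fx = F(x)
        r = np.sqrt(x**2 + Fx**2 + 2.0 * mu**2)      # r > 0 since mu > 0
        # chain rule: JH = diag(1 - x/r) + diag(1 - F/r) @ JF(x)
        JH = np.diag(1.0 - x / r) + np.diag(1.0 - Fx / r) @ JF(x)
        d = np.linalg.solve(JH, -Hx)                 # the single linear system
        t, merit = 1.0, Hx @ Hx
        while t > 1e-12:                             # backtracking line search
            Ht = H(x + t * d)
            if Ht @ Ht <= (1.0 - 1e-4 * t) * merit:
                break
            t *= 0.5
        x = x + t * d
        mu *= 0.5                                    # simplistic smoothing-parameter update
    return x
```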

AN AFFINE SCALING INTERIOR ALGORITHM VIA CONJUGATE GRADIENT AND LANCZOS METHODS FOR BOUND-CONSTRAINED NONLINEAR OPTIMIZATION

  • Jia, Chunxia;Zhu, Detong
    • Journal of applied mathematics & informatics, Vol. 29, No. 1-2, pp. 173-190, 2011
  • In this paper, we construct a new affine scaling interior-point algorithm using affine scaling conjugate gradient and Lanczos methods for bound-constrained nonlinear optimization. We obtain the iterative direction by solving a quadratic model via the affine scaling conjugate gradient and Lanczos methods. Using a backtracking line search technique, we find an acceptable trial step length along this direction which keeps the iterate strictly feasible and makes the objective function nonmonotonically decreasing. Global convergence and a local superlinear convergence rate of the proposed algorithm are established under reasonable conditions. Finally, we present numerical results to illustrate the effectiveness of the proposed algorithm.
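
As a rough sketch of the ingredients named in the abstract, the fragment below builds a diagonal affine scaling matrix from the distances to the bounds, solves the scaled quadratic model approximately with plain conjugate gradients, and backtracks so the trial point stays strictly interior. The paper additionally uses Lanczos iterations for possibly indefinite models and a nonmonotone line search, neither of which is reproduced here; the scaling choice and the safeguard factor are illustrative.

```python
import numpy as np

def affine_scaling_step(grad_f, hess_f, x, l, u):
    """One iteration sketch for min f(x) s.t. l <= x <= u, with x
    strictly interior: scale by the distance to the nearest bound,
    solve the scaled quadratic model with CG, backtrack to feasibility."""
    g, B = grad_f(x), hess_f(x)
    d = np.minimum(x - l, u - x)            # distance to nearest bound
    D = np.diag(d)
    # scaled model: min_s (D g)' s + 0.5 s' (D B D) s ; unscaled step = D s
    s = conjugate_gradient(D @ B @ D, -(D @ g))
    p = D @ s
    t = 1.0
    while t > 1e-12 and (np.any(x + t * p <= l) or np.any(x + t * p >= u)):
        t *= 0.5                            # keep the trial point strictly interior
    return x + 0.995 * t * p                # fraction-to-the-boundary safeguard

def conjugate_gradient(A, b, tol=1e-8, max_iter=100):
    """Plain CG for A s = b, assuming A symmetric positive definite."""
    s, r = np.zeros_like(b), b.copy()
    p, rs = r.copy(), r @ r
    for _ in range(max_iter):
        if np.sqrt(rs) < tol:
            break
        Ap = A @ p
        alpha = rs / (p @ Ap)
        s, r = s + alpha * p, r - alpha * Ap
        rs_new = r @ r
        p, rs = r + (rs_new / rs) * p, rs_new
    return s
```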