• Title/Abstract/Keyword: hessian

Search results: 109 items

A DUAL ALGORITHM FOR MINIMAX PROBLEMS

  • HE SUXIANG
    • Journal of applied mathematics & informatics / Vol. 17, No. 1_2_3 / pp.401-418 / 2005
  • In this paper, a dual algorithm based on a smoothing function of Bertsekas (1982) is established for solving unconstrained minimax problems. It is proven that the sequence of points, generated by solving a sequence of unconstrained minimizations of the smoothing function with a changing parameter t, converges locally at a Q-superlinear rate to a Kuhn-Tucker point under some mild conditions. The relationship between the condition number of the Hessian matrix of the smoothing function and the parameter is studied, which also validates the convergence theory. Finally, numerical results are reported to show the effectiveness of the algorithm.
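
As background on the smoothing idea only (not the paper's exact construction), the sketch below minimizes an exponential, log-sum-exp style smoothing of max_i f_i(x), in the spirit often attributed to Bertsekas (1982), for a decreasing sequence of parameters t. The component functions and the parameter schedule are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

# Component functions of an illustrative minimax problem: minimize max_i f_i(x).
def fs(x):
    return np.array([x[0]**2 + x[1]**2,
                     (x[0] - 2.0)**2 + (x[1] - 1.0)**2,
                     x[0] + x[1]])

def smoothed(x, t):
    # Exponential (log-sum-exp) smoothing: f_t(x) = t * log(sum_i exp(f_i(x) / t)),
    # which approaches max_i f_i(x) from above as t -> 0+.
    v = fs(x) / t
    m = v.max()
    return t * (m + np.log(np.exp(v - m).sum()))

x = np.zeros(2)
for t in [1.0, 0.1, 0.01, 0.001]:
    # Each subproblem is an unconstrained minimization of the smoothed function.
    x = minimize(lambda z: smoothed(z, t), x, method="BFGS").x
print("approximate minimax point:", x)
```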

A SELF SCALING MULTI-STEP RANK ONE PATTERN SEARCH ALGORITHM

  • Moghrabi, Issam A.R.
    • Journal of the Korean Society for Industrial and Applied Mathematics / Vol. 15, No. 4 / pp.267-275 / 2011
  • This paper proposes a new, quickly convergent pattern-search quasi-Newton algorithm that employs the multi-step version of the Symmetric Rank One (SR1) update. The new algorithm works on factorizations of the inverse Hessian approximations to make available a sequence of convergent positive bases required by the pattern-search process. The algorithm, in principle, resembles that developed in [1], with multi-step methods dominating the derivation and with numerical improvements obtained, as shown by the numerical results presented herein.
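
For orientation, the snippet below shows a plain single-step Symmetric Rank One (SR1) update of an inverse Hessian approximation, with the usual skipping safeguard; the paper's multi-step, factorized variant and the pattern-search machinery are not reproduced here.

```python
import numpy as np

def sr1_inverse_update(H, s, y, tol=1e-8):
    """Standard SR1 update of an inverse Hessian approximation H,
    with s = x_{k+1} - x_k and y = grad_{k+1} - grad_k."""
    v = s - H @ y
    denom = v @ y
    if abs(denom) <= tol * np.linalg.norm(v) * np.linalg.norm(y):
        return H                      # skip the update to avoid numerical breakdown
    return H + np.outer(v, v) / denom

# Toy usage on a quadratic f(x) = 0.5 x^T A x, where the true inverse Hessian is A^{-1}.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
grad = lambda x: A @ x
H, x = np.eye(2), np.array([1.0, 1.0])
for _ in range(5):
    x_new = x - H @ grad(x)
    H = sr1_inverse_update(H, x_new - x, grad(x_new) - grad(x))
    x = x_new
print(H @ A)                          # should be close to the identity matrix
```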

Diagnostics for Regression with Finite-Order Autoregressive Disturbances

  • Lee, Young-Hoon;Jeong, Dong-Bin;Kim, Soon-Kwi
    • Journal of the Korean Statistical Society / Vol. 31, No. 2 / pp.237-250 / 2002
  • Motivated by Cook's (1986) assessment of local influence, which investigates the curvature of a surface associated with an overall discrepancy measure, this paper extends the idea to the linear regression model with AR(p) disturbances. Diagnostics for linear regression models with AR(p) disturbances are discussed when simultaneous perturbations of the response vector are allowed. Numerical studies demonstrate routine application of the derived criterion.
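
For background only: in Cook's (1986) local influence approach, the normal curvature of the likelihood-displacement surface in a perturbation direction l is C_l = 2 |l^T Δ^T L̈^{-1} Δ l|, and the most influential direction is the dominant eigenvector of Δ^T L̈^{-1} Δ. The sketch below computes that direction for generic placeholder matrices; it does not reproduce the paper's Δ for the AR(p) regression model.

```python
import numpy as np

def max_local_influence(Delta, L_hess):
    """Cook (1986)-style local influence for a generic model.

    Delta  : (p, n) matrix of mixed derivatives of the perturbed log-likelihood
             with respect to parameters (rows) and perturbations (columns).
    L_hess : (p, p) Hessian of the log-likelihood at the estimate.
    Returns the perturbation direction with maximum normal curvature and that curvature.
    """
    F = Delta.T @ np.linalg.solve(L_hess, Delta)      # n x n symmetric matrix
    eigvals, eigvecs = np.linalg.eigh(F)
    i = np.argmax(np.abs(eigvals))
    return eigvecs[:, i], 2.0 * abs(eigvals[i])

# Toy illustration with random placeholder matrices (not the AR(p) derivation).
rng = np.random.default_rng(0)
Delta = rng.standard_normal((3, 10))
L_hess = -2.0 * np.eye(3)                             # stand-in observed-information Hessian
direction, curvature = max_local_influence(Delta, L_hess)
print("maximum curvature:", curvature)
```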

SCALING METHODS FOR QUASI-NEWTON METHODS

  • MOGHRABI, ISSAM A.R.
    • Journal of the Korean Society for Industrial and Applied Mathematics / Vol. 6, No. 1 / pp.91-107 / 2002
  • This paper presents two new self-scaling variable-metric algorithms. The first is based on a known two-parameter family of rank-two updating formulae; the second employs an initial scaling of the estimated inverse Hessian, which modifies the first self-scaling algorithm. The algorithms are compared with similar published algorithms, notably those due to Oren, to Shanno and Phua, and to Biggs, and with BFGS (the best-known quasi-Newton method). The best of these new and published algorithms are also modified to employ inexact line searches, with marginal effect. The new algorithms are superior, especially as the problem dimension increases.
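
The sketch below illustrates generic self-scaling in a variable-metric method: the inverse Hessian estimate is rescaled by an Oren-Luenberger-type factor s^T y / (y^T H y) before a BFGS-type update, and a simple backtracking (inexact) line search is used. It is only a sketch of the general idea, not the two-parameter family or the initially scaled variant proposed in the paper.

```python
import numpy as np

def self_scaling_bfgs_update(H, s, y):
    """One self-scaled BFGS update of an inverse Hessian approximation H."""
    sy = s @ y
    if sy <= 1e-12:
        return H                          # safeguard: keep H positive definite
    H = (sy / (y @ H @ y)) * H            # Oren-Luenberger-type self-scaling
    rho, I = 1.0 / sy, np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

def backtracking(f, gx, x, d, alpha=1.0):
    # Simple Armijo backtracking (inexact) line search.
    while f(x + alpha * d) > f(x) + 1e-4 * alpha * (gx @ d):
        alpha *= 0.5
    return alpha

# Minimal usage on an ill-conditioned quadratic f(x) = 0.5 x^T A x.
A = np.diag([1.0, 10.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x, H = np.ones(3), np.eye(3)
for _ in range(30):
    d = -H @ grad(x)
    alpha = backtracking(f, grad(x), x, d)
    x_new = x + alpha * d
    H = self_scaling_bfgs_update(H, x_new - x, grad(x_new) - grad(x))
    x = x_new
print("final gradient norm:", np.linalg.norm(grad(x)))
```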

AN UNCONDITIONALLY GRADIENT STABLE NUMERICAL METHOD FOR THE OHTA-KAWASAKI MODEL

  • Kim, Junseok;Shin, Jaemin
    • 대한수학회보 / Vol. 54, No. 1 / pp.145-158 / 2017
  • We present a finite difference method for solving the Ohta-Kawasaki model, a model of mesoscopic phase separation in block copolymers. Numerical methods for the Ohta-Kawasaki model need to inherit its mass conservation and energy dissipation properties. We prove these characteristic properties, as well as the solvability and unconditional gradient stability of the scheme, by using Hessian matrices of a discrete functional. We present numerical results that validate the mass conservation, energy dissipation, and unconditional stability of the method.
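
For orientation, the Ohta-Kawasaki dynamics can be written as φ_t = Δ(φ³ − φ − ε²Δφ) − α(φ − φ̄). The sketch below is a linearly stabilized semi-implicit Fourier-spectral discretization in one dimension with periodic boundaries; it conserves the mean of φ by construction, but it is not the authors' finite difference scheme, and all parameter values are illustrative.

```python
import numpy as np

# Illustrative parameters (not taken from the paper).
N, L = 128, 2.0 * np.pi          # grid points, domain length
eps, alpha, S = 0.1, 1.0, 2.0    # interface width, nonlocal strength, stabilization constant
dt, steps = 0.01, 500

k = 2.0 * np.pi * np.fft.fftfreq(N, d=L / N)   # wavenumbers
k2, k4 = k**2, k**4

rng = np.random.default_rng(1)
phi = 0.1 * rng.standard_normal(N)             # initial order parameter
phibar = phi.mean()                            # conserved spatial mean
phibar_hat = np.fft.fft(np.full(N, phibar))

for _ in range(steps):
    phi_hat = np.fft.fft(phi)
    nonlin_hat = np.fft.fft(phi**3 - phi)
    # Stabilized semi-implicit update of
    #   phi_t = Laplace(phi^3 - phi - eps^2 Laplace(phi)) - alpha (phi - phibar).
    numer = phi_hat * (1.0 / dt + S * k2) - k2 * nonlin_hat + alpha * phibar_hat
    denom = 1.0 / dt + S * k2 + eps**2 * k4 + alpha
    phi = np.real(np.fft.ifft(numer / denom))

print("mass drift:", abs(phi.mean() - phibar))  # should be near machine precision
```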

Confidence Interval Estimation Using SV in LS-SVM

  • Seok, Kyung-Ha
    • Journal of the Korean Data and Information Science Society / Vol. 14, No. 3 / pp.451-459 / 2003
  • This paper suggests a method to estimate confidence intervals using the SVs (Support Vectors) in LS-SVM (Least-Squares Support Vector Machine). The proposed method exploits the fact that the Hessian matrix obtained from the full data set and the one obtained from the SVs do not differ significantly. Since the suggested method uses only the SVs, a subset of the full data, it saves computing time and memory. A simulation study justifies the proposed method.
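
The sketch below illustrates the general idea under simple assumptions: an LS-SVM regressor is trained on the full data by solving the standard linear system, a subset of points with the largest dual weights is kept as "support vectors" (an illustrative selection rule, not necessarily the paper's), and the model refitted on that subset gives nearly the same predictions. The kernel, its parameters, and all function names are hypothetical.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM regression system [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:], A[1:, 0] = 1.0, 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                       # bias b and dual weights alpha

def lssvm_predict(Xtr, b, alpha, Xte, sigma=1.0):
    return rbf_kernel(Xte, Xtr, sigma) @ alpha + b

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, (80, 1)), axis=0)
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)

b_full, a_full = lssvm_fit(X, y)
sv = np.argsort(-np.abs(a_full))[:30]            # keep points with the largest |alpha|
b_sv, a_sv = lssvm_fit(X[sv], y[sv])

grid = np.linspace(-3, 3, 200)[:, None]
gap = lssvm_predict(X, b_full, a_full, grid) - lssvm_predict(X[sv], b_sv, a_sv, grid)
print("max |full - SV| prediction gap:", np.abs(gap).max())
```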

THE CONVERGENCE OF A DUAL ALGORITHM FOR NONLINEAR PROGRAMMING

  • Zhang, Li-Wei;He, Su-Xiang
    • Journal of applied mathematics & informatics / Vol. 7, No. 3 / pp.719-738 / 2000
  • A dual algorithm based on the smooth function proposed by Polyak (1988) is constructed for solving nonlinear programming problems with inequality constraints. It generates a sequence of points converging locally to a Kuhn-Tucker point by solving a sequence of unconstrained minimizations of a smooth potential function with a parameter. We study the relationship between the eigenvalues of the Hessian of this smooth potential function and the parameter, which is useful for analyzing the effectiveness of the dual algorithm.
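
As a sketch of this type of dual scheme only (assuming a Polyak-style modified log-barrier as the smooth function, which may differ in detail from the one used in the paper), the code below alternates an unconstrained minimization of M(x, λ, t) = f(x) − (1/t) Σ_i λ_i log(1 + t g_i(x)) with the multiplier update λ_i ← λ_i / (1 + t g_i(x)). The example problem is illustrative.

```python
import numpy as np
from scipy.optimize import minimize

# Example problem: minimize f(x) subject to g_i(x) >= 0.
f = lambda x: (x[0] - 2.0)**2 + (x[1] - 1.0)**2
g = lambda x: np.array([1.0 - x[0] - x[1],   # x0 + x1 <= 1
                        x[0],                # x0 >= 0
                        x[1]])               # x1 >= 0

def modified_barrier(x, lam, t):
    gx = g(x)
    if np.any(1.0 + t * gx <= 0.0):
        return np.inf                        # outside the barrier's domain
    return f(x) - (1.0 / t) * np.sum(lam * np.log(1.0 + t * gx))

x, lam, t = np.zeros(2), np.ones(3), 5.0
for _ in range(10):
    # Unconstrained minimization of the smooth potential for fixed multipliers.
    x = minimize(lambda z: modified_barrier(z, lam, t), x, method="Nelder-Mead").x
    # Dual (multiplier) update.
    lam = lam / (1.0 + t * g(x))
print("primal point ~", x, " multipliers ~", lam)
```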

ITERATIVE METHODS FOR LARGE-SCALE CONVEX QUADRATIC AND CONCAVE PROGRAMS

  • Oh, Se-Young
    • 대한수학회논문집 / Vol. 9, No. 3 / pp.753-765 / 1994
  • The linearly constrained quadratic program (QP) considered is $$\min f(x) = c^T x + \frac{1}{2}x^T H x \quad \text{subject to} \quad A^T x \geq b, \qquad (1)$$ where $c, x \in R^n$, $b \in R^m$, $H \in R^{n \times n}$ is symmetric, and $A \in R^{n \times m}$. If there are bounds on x, these are included in the matrix $A^T$. The Hessian matrix H may be positive definite or negative semi-definite. For large problems, H and the constraint matrix A are assumed to be sparse.
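
A tiny numerical instance of problem (1) is given below, solved with a general-purpose SLSQP solver rather than the large-scale iterative methods developed in the paper, purely to fix the notation; the data are made up.

```python
import numpy as np
from scipy.optimize import minimize

# Problem (1): minimize c^T x + 0.5 x^T H x  subject to A^T x >= b.
H = np.array([[2.0, 0.5],
              [0.5, 1.0]])                 # symmetric positive definite Hessian
c = np.array([-3.0, -4.0])
A = np.array([[1.0, -1.0],                 # A is n x m, so A^T x >= b gives m inequalities
              [1.0,  0.0]])
b = np.array([0.0, 0.0])                   # constraints: x0 + x1 >= 0 and -x0 >= 0

objective = lambda x: c @ x + 0.5 * x @ H @ x
constraints = {"type": "ineq", "fun": lambda x: A.T @ x - b}

res = minimize(objective, np.zeros(2), method="SLSQP", constraints=constraints)
print("minimizer:", res.x, "objective value:", res.fun)
```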

GLOBAL CONVERGENCE PROPERTIES OF THE MODIFIED BFGS METHOD ASSOCIATING WITH GENERAL LINE SEARCH MODEL

  • Liu, Jian-Guo;Guo, Qiang
    • Journal of applied mathematics & informatics / Vol. 16, No. 1_2 / pp.195-205 / 2004
  • For unconstrained programming with a non-convex objective function, this article gives a modified BFGS algorithm. The idea of the algorithm is to modify the approximate Hessian matrix so as to obtain a descent direction and guarantee the effectiveness of the quasi-Newton iteration. We prove the global convergence of the algorithm associated with a general line search model, and prove its quadratic convergence rate under some conditions.
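
The sketch below shows one common way of modifying the BFGS Hessian update so that it remains positive definite on non-convex problems, namely Powell's damping of the difference vector y; this is a representative modification for illustration, not necessarily the one proposed in the article.

```python
import numpy as np

def damped_bfgs_update(B, s, y):
    """BFGS update of a Hessian approximation B with Powell's damping,
    which keeps B positive definite even when s^T y <= 0 (non-convex case)."""
    sBs, sy = s @ B @ s, s @ y
    if sy < 0.2 * sBs:
        theta = 0.8 * sBs / (sBs - sy)
        y = theta * y + (1.0 - theta) * (B @ s)   # damped y gives s^T y = 0.2 s^T B s > 0
    Bs = B @ s
    return B - np.outer(Bs, Bs) / sBs + np.outer(y, y) / (s @ y)

# Toy usage on a non-convex function f(x) = x0^4 - x0^2 + x1^2.
f = lambda x: x[0]**4 - x[0]**2 + x[1]**2
grad = lambda x: np.array([4.0 * x[0]**3 - 2.0 * x[0], 2.0 * x[1]])

x, B = np.array([0.1, 1.0]), np.eye(2)
for _ in range(30):
    gx = grad(x)
    if np.linalg.norm(gx) < 1e-8:
        break
    d = np.linalg.solve(B, -gx)                   # quasi-Newton descent direction
    alpha = 1.0
    while f(x + alpha * d) > f(x) + 1e-4 * alpha * (gx @ d):
        alpha *= 0.5                              # simple backtracking line search
    x_new = x + alpha * d
    B = damped_bfgs_update(B, x_new - x, grad(x_new) - gx)
    x = x_new
print("stationary point ~", x, " gradient ~", grad(x))
```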