• Title/Summary/Keyword: Unconstrained


MODIFIED LIMITED MEMORY BFGS METHOD WITH NONMONOTONE LINE SEARCH FOR UNCONSTRAINED OPTIMIZATION

  • Yuan, Gonglin;Wei, Zengxin;Wu, Yanlin
    • Journal of the Korean Mathematical Society
    • /
    • v.47 no.4
    • /
    • pp.767-788
    • /
    • 2010
  • In this paper, we propose two limited memory BFGS algorithms with a nonmonotone line search technique for unconstrained optimization problems. The global convergence of the given methods is established under suitable conditions. Numerical results show that the presented algorithms are more competitive than the standard BFGS method.
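As background on the two ingredients named in this abstract, the sketch below combines a standard limited memory BFGS direction (two-loop recursion) with a Grippo-Lampariello-Lucidi-style nonmonotone Armijo line search. It is a minimal illustration of the general technique under arbitrary parameter choices (memory size m, window M, Rosenbrock test function), not the two algorithms proposed in the paper.

```python
# Minimal L-BFGS direction (two-loop recursion) + nonmonotone Armijo line search.
# Illustrative only: not the paper's algorithms; parameters are arbitrary.
import numpy as np

def two_loop_direction(grad, s_list, y_list):
    """L-BFGS two-loop recursion: returns -H_k * grad using stored (s, y) pairs."""
    q = grad.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q -= a * y
    if s_list:                                  # initial scaling gamma * I
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
        b = (y @ q) / (y @ s)
        q += (a - b) * s
    return -q

def nonmonotone_armijo(f, x, d, g, f_hist, delta=1e-4, beta=0.5):
    """Accept t if f(x + t d) <= max(recent f values) + delta * t * g.d."""
    f_max = max(f_hist)
    t = 1.0
    for _ in range(60):                         # bounded backtracking
        if f(x + t * d) <= f_max + delta * t * (g @ d):
            break
        t *= beta
    return t

def lbfgs_nonmonotone(f, grad, x0, m=5, M=10, tol=1e-6, max_iter=200):
    x = x0.copy()
    s_list, y_list, f_hist = [], [], [f(x)]
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = two_loop_direction(g, s_list, y_list)
        t = nonmonotone_armijo(f, x, d, g, f_hist)
        x_new = x + t * d
        s, y = x_new - x, grad(x_new) - g
        if s @ y > 1e-10:                       # keep pair only if curvature holds
            s_list.append(s); y_list.append(y)
            if len(s_list) > m:
                s_list.pop(0); y_list.pop(0)
        f_hist.append(f(x_new))
        f_hist = f_hist[-M:]                    # nonmonotone reference window
        x = x_new
    return x

if __name__ == "__main__":
    # Rosenbrock test problem (arbitrary example).
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                               200*(x[1] - x[0]**2)])
    print(lbfgs_nonmonotone(f, grad, np.array([-1.2, 1.0])))
```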

A NOVEL FILLED FUNCTION METHOD FOR GLOBAL OPTIMIZATION

  • Lin, Youjiang;Yang, Yongjian;Zhang, Liansheng
    • Journal of the Korean Mathematical Society
    • /
    • v.47 no.6
    • /
    • pp.1253-1267
    • /
    • 2010
  • This paper considers unconstrained global optimization with revised filled function methods. The minimization sequence can move from a local minimizer to a better minimizer of the objective function by minimizing an auxiliary (filled) function constructed at the current local minimizer. Some promising numerical results are also included.
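For readers new to the filled function idea, the sketch below alternates ordinary local minimization of f with minimization of a classical Ge-type filled function built at the current local minimizer, P(x) = exp(-||x - x*||^2 / rho^2) / (r + f(x)); for suitably chosen r and rho, descending on P can carry the iterate out of the basin of x* toward a lower basin. This is generic background, not the revised filled functions proposed in the paper, and the parameters and the Rastrigin test function are arbitrary choices.

```python
# Illustrative filled-function loop using a classical Ge-type auxiliary function;
# this is NOT the revised filled functions proposed in the paper above.
import numpy as np
from scipy.optimize import minimize

def filled_minimize(f, x0, r=1.0, rho=2.0, cycles=5, seed=0):
    rng = np.random.default_rng(seed)
    best = minimize(f, x0)                         # first local minimizer of f
    x_star, f_star = best.x, best.fun
    for _ in range(cycles):
        def P(x):                                  # Ge-type filled function at x_star;
            # r must keep r + f(x) > 0 (here the test function satisfies f >= 0)
            return np.exp(-np.sum((x - x_star)**2) / rho**2) / (r + f(x))
        # Start the auxiliary minimization slightly away from x_star.
        x_away = x_star + rng.normal(scale=0.5, size=x_star.shape)
        x_escape = minimize(P, x_away).x
        cand = minimize(f, x_escape)               # restart local search on f
        if cand.fun < f_star - 1e-8:               # accept only a strictly better minimizer
            x_star, f_star = cand.x, cand.fun
    return x_star, f_star

if __name__ == "__main__":
    # Rastrigin function: nonnegative, many local minima (arbitrary example).
    f = lambda x: 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
    print(filled_minimize(f, np.array([3.3, 2.2])))
```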

Extension and Review of Restricted and Unrestricted Mixed Models in the Generalized Linear Models

  • Choi, Sung-Woon
    • Proceedings of the Safety Management and Science Conference
    • /
    • 2009.04a
    • /
    • pp.185-192
    • /
    • 2009
  • This research contributes to extending and reviewing restricted (constrained) and unrestricted (unconstrained) mixed models in GLMs (Generalized Linear Models). The paper includes a methodology for finding the EMS (Expected Mean Square) and the $F_0$ ratio. The results can be applied to gauge R&R (Repeatability and Reproducibility) in MSA (Measurement System Analysis).
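As background on where the restricted/unrestricted distinction matters in this setting, the block below records the textbook EMS columns and $F_0$ ratios for a two-factor mixed model (factor A fixed with a levels, factor B random with b levels, n replicates), which is the structure underlying gauge R&R studies. It follows the standard presentation (e.g. Montgomery) rather than the derivations of this particular paper.

```latex
% Two-factor mixed model, A fixed (a levels), B random (b levels), n replicates.
% The restricted model imposes \sum_i (\tau\beta)_{ij} = 0 for each j; the
% unrestricted model does not.
\[
\begin{array}{lll}
\text{Source} & \text{EMS (restricted)} & \text{EMS (unrestricted)} \\ \hline
A\ \text{(fixed)}  & \sigma^2 + n\sigma^2_{\tau\beta} + \frac{bn\sum_i \tau_i^2}{a-1}
                   & \sigma^2 + n\sigma^2_{\tau\beta} + \frac{bn\sum_i \tau_i^2}{a-1} \\
B\ \text{(random)} & \sigma^2 + an\sigma^2_{\beta}
                   & \sigma^2 + n\sigma^2_{\tau\beta} + an\sigma^2_{\beta} \\
AB                 & \sigma^2 + n\sigma^2_{\tau\beta}
                   & \sigma^2 + n\sigma^2_{\tau\beta} \\
\text{Error}       & \sigma^2 & \sigma^2
\end{array}
\]
% Hence the test statistic for the random factor differs between the two models:
\[
F_0^{\text{restricted}} = \frac{MS_B}{MS_E}, \qquad
F_0^{\text{unrestricted}} = \frac{MS_B}{MS_{AB}},
\]
% while F_0 = MS_A / MS_{AB} for the fixed factor in both models.
```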


Recognition of Unconstrained Handwritten Numerals using Fully-connected RNN

  • 원상철;배수정;최한고
    • Proceedings of the IEEK Conference
    • /
    • 1999.11a
    • /
    • pp.1007-1010
    • /
    • 1999
  • This paper describes the recognition of totally unconstrained handwritten numerals using neural networks. Neural networks with multiple output nodes have been used successfully to classify complex handwritten numerals. The recognition system consists of a preprocessing stage that extracts features using Kirsch masks and a classification stage that recognizes the numerals using a fully-connected recurrent neural network (RNN). Simulation results on the numeral database of Concordia University, Montreal, Canada, are presented. The recognition system proposed in this paper outperforms other recognition systems reported on the same database.
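For context on the preprocessing stage mentioned above, the sketch below computes generic Kirsch-mask features: the eight 3x3 compass kernels are generated by rotating the border of the standard Kirsch mask, each is convolved with the image, and the directional responses are pooled over a coarse zoning grid into a feature vector. The zoning scheme and image size are arbitrary assumptions, and the fully-connected RNN classifier of the paper is not reproduced.

```python
# Kirsch compass-mask features for a grayscale digit image (illustrative only;
# not the exact preprocessing or RNN classifier used in the paper above).
import numpy as np
from scipy.ndimage import convolve

def kirsch_kernels():
    """Eight 3x3 Kirsch masks, generated by rotating the border of the base mask."""
    base = np.array([[5, 5, 5],
                     [-3, 0, -3],
                     [-3, -3, -3]], dtype=float)
    # Border positions listed clockwise, starting at the top-left corner.
    ring = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    vals = [base[r, c] for r, c in ring]
    kernels = []
    for shift in range(8):
        k = np.zeros((3, 3))
        for (r, c), v in zip(ring, np.roll(vals, shift)):
            k[r, c] = v
        kernels.append(k)
    return kernels

def kirsch_features(image, zones=4):
    """Directional edge responses pooled over a zones x zones grid -> feature vector."""
    responses = [convolve(image.astype(float), k, mode="constant")
                 for k in kirsch_kernels()]
    h, w = image.shape
    zh, zw = h // zones, w // zones
    feats = []
    for resp in responses:                       # one response map per compass direction
        for i in range(zones):
            for j in range(zones):
                feats.append(resp[i*zh:(i+1)*zh, j*zw:(j+1)*zw].mean())
    return np.array(feats)                       # 8 * zones^2 features

if __name__ == "__main__":
    digit = np.random.rand(28, 28)               # placeholder for a numeral image
    print(kirsch_features(digit).shape)          # (128,) with the default 4x4 zoning
```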


CONVERGENCE OF THE NONMONOTONE PERRY-SHANNO METHOD FOR UNCONSTRAINED OPTIMIZATION

  • Ou, Yigui;Ma, Wei
    • Journal of applied mathematics & informatics
    • /
    • v.30 no.5_6
    • /
    • pp.971-980
    • /
    • 2012
  • In this paper, a method combined with a new form of nonmonotone line search technique is proposed; it can be regarded as a generalization of the Perry-Shanno memoryless quasi-Newton method. Under some reasonable conditions, the global convergence of the proposed method is proven. Numerical tests show its efficiency.
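The nonmonotone line search referred to here is typically a variant of the Grippo-Lampariello-Lucidi condition; the block below states that classical form as background (the paper proposes its own new form, which may differ).

```latex
% Classical (GLL) nonmonotone Armijo condition: the step \alpha_k is accepted when
\[
f(x_k + \alpha_k d_k) \;\le\; \max_{0 \le j \le m(k)} f(x_{k-j})
\;+\; \delta\, \alpha_k\, g_k^{\mathsf T} d_k, \qquad 0 < \delta < 1,
\]
% where m(0) = 0 and m(k) \le \min\{m(k-1) + 1,\, M\} bounds how far back the
% reference maximum of past function values may look.
```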

A CLASS OF NONMONOTONE SPECTRAL MEMORY GRADIENT METHOD

  • Yu, Zhensheng;Zang, Jinsong;Liu, Jingzhao
    • Journal of the Korean Mathematical Society
    • /
    • v.47 no.1
    • /
    • pp.63-70
    • /
    • 2010
  • In this paper, we develop a nonmonotone spectral memory gradient method for unconstrained optimization, in which the spectral stepsize and a class of memory gradient directions are combined efficiently. Global convergence is obtained by using a nonmonotone line search strategy, and numerical tests are given to show the efficiency of the proposed algorithm.
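As background on the two ingredients named here, the spectral stepsize is usually one of the Barzilai-Borwein choices, and a memory gradient direction mixes the current negative gradient with one or more previous directions. The block below records those standard formulas; the paper's specific combination and parameter rules are not reproduced.

```latex
% Barzilai-Borwein (spectral) stepsizes, with s_{k-1} = x_k - x_{k-1} and
% y_{k-1} = g_k - g_{k-1}:
\[
\alpha_k^{BB1} = \frac{s_{k-1}^{\mathsf T} s_{k-1}}{s_{k-1}^{\mathsf T} y_{k-1}},
\qquad
\alpha_k^{BB2} = \frac{s_{k-1}^{\mathsf T} y_{k-1}}{y_{k-1}^{\mathsf T} y_{k-1}}.
\]
% A (one-step) memory gradient direction mixes the new gradient with the previous direction:
\[
d_k = -\,g_k + \beta_k d_{k-1}, \qquad d_0 = -g_0 .
\]
```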

GLOBAL CONVERGENCE PROPERTIES OF THE MODIFIED BFGS METHOD ASSOCIATING WITH GENERAL LINE SEARCH MODEL

  • Liu, Jian-Guo;Guo, Qiang
    • Journal of applied mathematics & informatics
    • /
    • v.16 no.1_2
    • /
    • pp.195-205
    • /
    • 2004
  • For unconstrained programming with non-convex functions, this article gives a modified BFGS algorithm. The idea of the algorithm is to modify the approximate Hessian matrix so as to obtain a descent direction and guarantee that the quasi-Newton iteration remains effective. We prove the global convergence properties of the algorithm associated with a general form of line search, and prove the quadratic convergence rate of the algorithm under some conditions.
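For orientation, the sketch below implements the plain BFGS update together with the common safeguard of skipping the update whenever the curvature condition $s_k^{\mathsf T} y_k > 0$ fails, which is one simple way to keep the approximate Hessian positive definite on non-convex problems. The paper's actual modification of the approximate Hessian is not reproduced here; the test function and tolerances are arbitrary.

```python
# Plain BFGS with a curvature safeguard for non-convex problems (illustrative only;
# the modified update analyzed in the paper above is not reproduced here).
import numpy as np

def bfgs_safeguarded(f, grad, x0, tol=1e-6, max_iter=200, c1=1e-4, beta=0.5):
    n = x0.size
    B = np.eye(n)                              # approximate Hessian
    x, g = x0.copy(), grad(x0)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -np.linalg.solve(B, g)             # quasi-Newton direction
        t = 1.0                                # backtracking Armijo line search
        while f(x + t * d) > f(x) + c1 * t * (g @ d) and t > 1e-12:
            t *= beta
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 1e-10 * np.linalg.norm(s) * np.linalg.norm(y):
            Bs = B @ s                         # standard BFGS update of B
            B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (s @ y)
        # else: skip the update so B stays positive definite on non-convex regions
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # A non-convex test function (arbitrary example).
    f = lambda x: x[0]**4 - 3 * x[0]**2 + x[1]**2 + 0.5 * x[0]
    grad = lambda x: np.array([4 * x[0]**3 - 6 * x[0] + 0.5, 2 * x[1]])
    print(bfgs_safeguarded(f, grad, np.array([2.0, 1.0])))
```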

CONVERGENCE OF SUPERMEMORY GRADIENT METHOD

  • Shi, Zhen-Jun;Shen, Jie
    • Journal of applied mathematics & informatics
    • /
    • v.24 no.1_2
    • /
    • pp.367-376
    • /
    • 2007
  • In this paper we consider the global convergence of a new supermemory gradient method for unconstrained optimization problems. A new trust region radius is proposed to make the method converge stably, which makes it suitable for solving large-scale minimization problems. Some global convergence results are obtained under mild conditions. Numerical results show that this new method is effective and stable in practical computation.
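For reference, a supermemory gradient direction has the general form below, combining the current negative gradient with several previous search directions; the trust-region-style bound on the step indicated in the comment is only one plausible reading of the abstract, and the paper's actual radius is not reproduced.

```latex
% General supermemory gradient direction (uses several previous directions, not just one):
\[
d_k \;=\; -\,g_k \;+\; \sum_{i=1}^{m} \beta_{k,i}\, d_{k-i}, \qquad d_0 = -g_0,
\]
% with the trial step additionally kept inside a trust-region-like bound of the form
% \|\alpha_k d_k\| \le \Delta_k  (an assumption here; the paper's radius is not reproduced).
```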

Optimum Design of a Linear Induction Motor for Electromagnetic Pump using Genetic Algorithm

  • Kim, Chang-Eob;Hong, Sung-Ok
    • Proceedings of the KIEE Conference
    • /
    • 2000.07b
    • /
    • pp.744-746
    • /
    • 2000
  • This paper presents an optimum design of a linear induction motor (LIM) using a genetic algorithm (GA). The sequential unconstrained minimization technique (SUMT) is used to transform the constrained nonlinear optimization problem into a simple unconstrained one. The objective functions of the LIM, such as thrust and weight, are optimized, and the result is applied to the design of a linear induction pump.
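To illustrate the overall approach (penalty-based SUMT transformation plus a GA on the resulting unconstrained problem), the sketch below uses an exterior quadratic penalty with a deliberately tiny genetic algorithm on a toy stand-in problem; the LIM objectives (thrust, weight) and constraints of the paper are not reproduced.

```python
# SUMT-style penalty transformation solved with a very small genetic algorithm.
# Illustrative of the general approach only; the LIM design problem is replaced
# by a toy objective and constraint.
import numpy as np

def penalized(f, constraints, r):
    """Exterior quadratic penalty: f(x) + r * sum(max(0, g_i(x))^2) for g_i(x) <= 0."""
    def phi(x):
        return f(x) + r * sum(max(0.0, g(x)) ** 2 for g in constraints)
    return phi

def tiny_ga(phi, bounds, pop_size=40, generations=100, sigma=0.2, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
    for _ in range(generations):
        fitness = np.array([phi(x) for x in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]               # truncation selection
        children = parents + rng.normal(scale=sigma, size=parents.shape)  # Gaussian mutation
        pop = np.clip(np.vstack([parents, children]), lo, hi)
    return min(pop, key=phi)

def sumt_ga(f, constraints, bounds, r0=1.0, growth=10.0, outer=5):
    """Sequentially increase the penalty parameter and re-optimize (SUMT)."""
    x_best, r = None, r0
    for k in range(outer):
        x_best = tiny_ga(penalized(f, constraints, r), bounds, seed=k)
        r *= growth                       # tighten constraint enforcement each round
    return x_best

if __name__ == "__main__":
    # Toy stand-in: minimize a "weight" x0^2 + x1^2 subject to a "thrust" requirement
    # x0 + 2*x1 >= 2, i.e. g(x) = 2 - x0 - 2*x1 <= 0.  Purely illustrative.
    f = lambda x: x[0] ** 2 + x[1] ** 2
    g = [lambda x: 2.0 - x[0] - 2.0 * x[1]]
    bounds = np.array([[-3.0, 3.0], [-3.0, 3.0]])
    print(sumt_ga(f, g, bounds))          # should approach roughly (0.4, 0.8)
```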


GLOBAL CONVERGENCE PROPERTIES OF TWO MODIFIED BFGS-TYPE METHODS

  • Guo, Qiang;Liu, Jian-Guo
    • Journal of applied mathematics & informatics
    • /
    • v.23 no.1_2
    • /
    • pp.311-319
    • /
    • 2007
  • This article studies a modified BFGS algorithm for solving smooth unconstrained strongly convex minimization problems. The modified BFGS method is based on the new quasi-Newton equation $B_{k+1}s_k = y_k^*$, where $y_k^* = y_k + A_k s_k$ and $A_k$ is a matrix. Wei, Li and Qi [WLQ] have proven that the average performance of two such algorithms is better than that of the classical one. In this paper, we prove the global convergence of these algorithms associated with a general line search rule.
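For readers who want the update spelled out, substituting the modified curvature vector $y_k^*$ into the usual BFGS formula yields the BFGS-type update below, which satisfies the modified secant condition $B_{k+1}s_k = y_k^*$; the particular choices of the matrix $A_k$ that define the two algorithms are not reproduced here.

```latex
% BFGS-type update built on the modified secant condition B_{k+1} s_k = y_k^*:
\[
B_{k+1} \;=\; B_k \;-\; \frac{B_k s_k s_k^{\mathsf T} B_k}{s_k^{\mathsf T} B_k s_k}
\;+\; \frac{y_k^{*} (y_k^{*})^{\mathsf T}}{s_k^{\mathsf T} y_k^{*}},
\qquad y_k^{*} = y_k + A_k s_k,
\]
% where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k; the choice of A_k is the paper's.
```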