Title/Summary/Keyword: gradient descent optimization


FIRST ORDER GRADIENT OPTIMIZATION IN LISP

  • Stanimirovic, Predrag;Rancic, Svetozar
    • Journal of applied mathematics & informatics, v.5 no.3, pp.701-716, 1998
  • In this paper we develop algorithms in the programming language SCHEME for implementing the main first order gradient techniques for unconstrained optimization. Implementations of the descent techniques which use non-optimal descent steps, as well as implementations of the optimal descent techniques, are described. We also investigate implementation of the global problem called optimization along a line. The developed programs are effective and simpler with respect to the corresponding programs in procedural programming languages. Several numerical examples are reported.
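For a concrete reference point, the sketch below re-implements the core ideas in Python rather than Scheme (the paper's Scheme sources are not reproduced here): steepest descent with either a fixed, non-optimal step or an "optimal" step found by one-dimensional minimization along the descent direction. The test function and step size are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def steepest_descent(f, grad, x0, steps=200, step_size=None, tol=1e-8):
    """Steepest descent; uses a fixed (non-optimal) step if step_size is
    given, otherwise an 'optimal' step found by one-dimensional
    minimization along the descent direction (optimization along a line)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g  # first-order descent direction
        if step_size is not None:
            t = step_size
        else:
            t = minimize_scalar(lambda a: f(x + a * d),
                                bounds=(0.0, 1e3), method="bounded").x
        x = x + t * d
    return x

# Example: minimize the convex quadratic f(x, y) = x^2 + 10 y^2.
f = lambda x: x[0] ** 2 + 10 * x[1] ** 2
grad = lambda x: np.array([2 * x[0], 20 * x[1]])
print(steepest_descent(f, grad, [3.0, 1.0]))                   # optimal step
print(steepest_descent(f, grad, [3.0, 1.0], step_size=0.05))   # fixed step
```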

A new optimization method for improving the performance of neural networks for optimization

  • 조영현
    • Journal of the Korean Institute of Telematics and Electronics C, v.34C no.12, pp.61-69, 1997
  • This paper proposes a new method for improving the performance of neural networks for optimization using a hybrid of the gradient descent method and a dynamic tunneling system. The update rule of the gradient descent method, which has a fast convergence characteristic, is applied for high-speed optimization. The update rule of the dynamic tunneling system, which is a deterministic method with a tunneling phenomenon, is applied for global optimization. Having converged to a local minimum by the gradient descent update, the network escapes the local minimum by applying the dynamic tunneling system. The proposed method has been applied to traveling salesman problems and optimal task partition problems, and its performance is compared to that of the Hopfield model using only the update rule of the gradient descent method.
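The abstract does not specify the dynamic tunneling update itself, so the minimal Python sketch below substitutes a random-restart escape phase for the paper's deterministic tunneling rule, purely to illustrate the hybrid structure: fast local descent alternating with a global escape mechanism. The quartic test function and all constants are illustrative.

```python
import numpy as np

def descend(f, grad, x, lr=0.05, steps=500, tol=1e-8):
    """Plain gradient descent: the fast local-convergence phase."""
    for _ in range(steps):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - lr * np.clip(g, -10, 10)  # clipping keeps the toy stable
    return x

def hybrid_minimize(f, grad, x0, rounds=20, seed=0):
    """Alternate local descent with an escape phase. A random restart
    around the current minimum stands in for the paper's deterministic
    dynamic tunneling update, which the abstract does not specify."""
    rng = np.random.default_rng(seed)
    best = descend(f, grad, np.asarray(x0, float))
    for _ in range(rounds):
        trial = descend(f, grad, best + rng.normal(size=best.shape))
        if f(trial) < f(best):  # keep the escape only if it improves
            best = trial
    return best

# Quartic with two minima: descent alone stalls in the shallower one.
f = lambda x: x[0] ** 4 - 3 * x[0] ** 2 + x[0]
grad = lambda x: np.array([4 * x[0] ** 3 - 6 * x[0] + 1])
print(hybrid_minimize(f, grad, [2.0]))  # ~ -1.30, the global minimum
```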


Gradient Descent Training Method for Optimizing Data Prediction Models

  • Hur, Kyeong
    • Journal of Practical Engineering Education, v.14 no.2, pp.305-312, 2022
  • In this paper, we focus on training students to create and optimize a basic data prediction model, and we propose a teaching method for gradient descent, the machine learning technique widely used to optimize data prediction models. The method visually shows the entire operation of gradient descent as it optimizes, via differentiation, the parameter values required by a data prediction model, and it teaches the effective use of mathematical differentiation in machine learning. To visualize the entire operation of gradient descent, we implement the gradient descent software in a spreadsheet. First, a two-variable gradient descent training method is presented, and the accuracy of the two-variable data prediction model is verified by comparison with the least squares method. Second, a three-variable gradient descent training method is presented, and the accuracy of the three-variable data prediction model is verified. Afterwards, a direction for gradient descent optimization practice is presented, and the educational effect of the proposed method is analyzed through education-satisfaction results from non-majors.
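A minimal Python rendering of the paper's two-variable exercise (the paper itself uses a spreadsheet): fit y ≈ w·x + b by gradient descent on the mean squared error, then check the result against the closed-form least squares solution. The data, learning rate, and iteration count are illustrative.

```python
import numpy as np

# Toy data for a two-parameter prediction model y ≈ w * x + b.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([3.1, 4.9, 7.2, 9.1, 10.8])

# Gradient descent on the mean squared error, mirroring the spreadsheet
# exercise: differentiate the error with respect to w and b, then step
# against the gradient.
w, b, lr = 0.0, 0.0, 0.01
for _ in range(20000):
    e = w * x + b - y                  # prediction error
    w -= lr * 2 * np.mean(e * x)       # dMSE/dw
    b -= lr * 2 * np.mean(e)           # dMSE/db
print("gradient descent:", w, b)

# Closed-form least squares for comparison, as in the paper's accuracy check.
A = np.column_stack([x, np.ones_like(x)])
print("least squares:  ", np.linalg.lstsq(A, y, rcond=None)[0])
```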

Nonlinear optimization algorithm using monotonically increasing quantization resolution

  • Jinwuk Seok;Jeong-Si Kim
    • ETRI Journal, v.45 no.1, pp.119-130, 2023
  • We propose a quantized gradient search algorithm that can achieve global optimization by monotonically reducing the quantization step with respect to time, where the quantization maps the optimization update onto integer or fixed-point fractional values. According to the white noise hypothesis, if the quantization step is sufficiently small and the quantization is well defined, the round-off error caused by quantization can be regarded as a random variable with an independent and identical distribution. Thus, we rewrite the search equation based on gradient descent as a stochastic differential equation and, by stochastic analysis of the objective function, obtain the monotonically decreasing rate of the quantization step that enables global optimization. Consequently, when the search equation is quantized with a monotonically decreasing quantization step that suitably reduces the round-off error, we obtain a search algorithm that evolves from the base optimization algorithm. Numerical simulations indicate that, owing to this quantization-based global-optimization property, the proposed algorithm shows better optimization performance over the search space at each iteration than the conventional algorithm, with a higher success rate and fewer iterations.
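The sketch below illustrates the general idea under stated assumptions: a gradient update rounded onto a grid whose width shrinks monotonically over time, so early iterations see coarse, noise-like updates that aid exploration while later iterations refine the solution. The geometric decay schedule is a placeholder; the paper derives a specific monotone rate from its stochastic analysis.

```python
import numpy as np

def quantized_search(f, grad, x0, steps=2000, lr=0.1, q0=1.0, decay=0.995):
    """Gradient search whose update is rounded onto a grid of width q.
    The quantization step q decreases monotonically: coarse early updates
    behave like injected noise, fine late updates refine the solution."""
    x = np.asarray(x0, float)
    q = q0
    for _ in range(steps):
        step = -lr * grad(x)
        x = x + q * np.round(step / q)  # round-off error acts like noise
        q *= decay                      # monotonically increasing resolution
    return x

# Quartic with a local and a global minimum; plain descent from x = 2
# would stall in the local one.
f = lambda x: x[0] ** 4 - 3 * x[0] ** 2 + x[0]
grad = lambda x: np.array([4 * x[0] ** 3 - 6 * x[0] + 1])
print(quantized_search(f, grad, [2.0]))  # ~ -1.30, the global minimum
```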

GLOBAL CONVERGENCE OF AN EFFICIENT HYBRID CONJUGATE GRADIENT METHOD FOR UNCONSTRAINED OPTIMIZATION

  • Liu, Jinkui;Du, Xianglin
    • Bulletin of the Korean Mathematical Society, v.50 no.1, pp.73-81, 2013
  • In this paper, an efficient hybrid nonlinear conjugate gradient method is proposed to solve general unconstrained optimization problems on the basis of the CD method [2] and the DY method [5]; the hybrid possesses the sufficient descent property without any line search. Under the Wolfe line search conditions, we prove the global convergence of the hybrid method for general nonconvex functions. The numerical results show that the hybrid method is especially efficient for the given test problems and can be widely used in scientific and engineering computation.
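A hedged sketch of the general scheme: nonlinear conjugate gradient in which β blends the CD and DY formulas (here by taking the larger of the two denominators, which is one simple combination, not necessarily the paper's), with SciPy's Wolfe line search supplying the step length.

```python
import numpy as np
from scipy.optimize import line_search

def hybrid_cg(f, grad, x0, iters=200, tol=1e-6):
    """Nonlinear CG with a hybrid of the CD and DY beta formulas."""
    x = np.asarray(x0, float)
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d)[0]  # Wolfe line search
        if alpha is None:
            alpha = 1e-4                       # fallback step
        x_new = x + alpha * d
        g_new = grad(x_new)
        dy_den = d @ (g_new - g)               # DY denominator
        cd_den = -(d @ g)                      # CD denominator
        beta = (g_new @ g_new) / max(dy_den, cd_den, 1e-12)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Rosenbrock test function.
f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
                           200 * (x[1] - x[0] ** 2)])
print(hybrid_cg(f, grad, [-1.2, 1.0]))  # ~ (1, 1)
```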

A NEW CLASS OF NONLINEAR CONJUGATE GRADIENT METHOD FOR UNCONSTRAINED OPTIMIZATION MODELS AND ITS APPLICATION IN PORTFOLIO SELECTION

  • Malik, Maulana;Sulaiman, Ibrahim Mohammed;Mamat, Mustafa;Abas, Siti Sabariah;Sukono, Sukono
    • Nonlinear Functional Analysis and Applications, v.26 no.4, pp.811-837, 2021
  • In this paper, we propose a new conjugate gradient method for solving unconstrained optimization models. Under exact and strong Wolfe line searches, the proposed method possesses the sufficient descent condition and global convergence properties. Numerical results show that the proposed method is efficient for small, medium, and large dimensions of the given test functions. In addition, the proposed method is applied to solve practical portfolio selection problems.
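To make the application concrete, the sketch below sets up a hypothetical unconstrained mean-variance portfolio objective and hands it to SciPy's generic nonlinear CG solver as a stand-in for the proposed method; the returns, covariance, and risk weight are invented for illustration and are not the paper's data.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical expected returns and covariance for four assets.
mu = np.array([0.08, 0.12, 0.10, 0.07])
Sigma = np.diag([0.04, 0.09, 0.06, 0.03])
lam = 2.0  # risk-aversion weight

def objective(w):
    # Unconstrained mean-variance trade-off: risk minus weighted return.
    return lam * w @ Sigma @ w - mu @ w

# SciPy's nonlinear conjugate gradient solver stands in for the
# paper's proposed CG method.
w = minimize(objective, x0=np.full(4, 0.25), method="CG").x
print(w / w.sum())  # normalize to portfolio weights
```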

L1-penalized AUC-optimization with a surrogate loss

  • Hyungwoo Kim;Seung Jun Shin
    • Communications for Statistical Applications and Methods, v.31 no.2, pp.203-212, 2024
  • The area under the ROC curve (AUC) is one of the most common criteria used to measure the overall performance of binary classifiers across a wide range of machine learning problems. In this article, we propose an L1-penalized AUC-optimization classifier that directly maximizes the AUC for high-dimensional data. Toward this, we employ an AUC-consistent surrogate loss function and combine it with the L1-norm penalty, which enables us to estimate coefficients and select informative variables simultaneously. In addition, we develop an efficient optimization algorithm that adopts k-means clustering and proximal gradient descent, which yields computational advantages when obtaining solutions for the proposed method. Numerical simulation studies demonstrate that the proposed method shows promising performance in terms of prediction accuracy, variable selectivity, and computational cost.
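A minimal proximal-gradient sketch of the general approach, assuming a pairwise logistic surrogate for the AUC and plain soft-thresholding for the L1 penalty; unlike the paper, it enumerates all positive/negative pairs instead of reducing them with k-means clustering.

```python
import numpy as np

def soft_threshold(b, t):
    """Proximal operator of the L1 norm (soft-thresholding)."""
    return np.sign(b) * np.maximum(np.abs(b) - t, 0.0)

def l1_auc(X, y, lam=0.1, lr=0.1, iters=500):
    """Proximal gradient descent on a pairwise logistic surrogate of the
    AUC: the score difference on every positive/negative pair should be
    positive, and the L1 penalty zeroes out uninformative coefficients."""
    Xp, Xn = X[y == 1], X[y == 0]
    # Pairwise differences x_i - x_j for every positive i, negative j.
    D = (Xp[:, None, :] - Xn[None, :, :]).reshape(-1, X.shape[1])
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        m = D @ beta                                     # pair margins
        g = -(D.T @ (1 / (1 + np.exp(m)))) / len(D)      # logistic gradient
        beta = soft_threshold(beta - lr * g, lr * lam)   # proximal step
    return beta

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=100) > 0).astype(int)
print(l1_auc(X, y).round(3))  # only the first two features stay active
```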

GLOBAL CONVERGENCE OF A NEW SPECTRAL PRP CONJUGATE GRADIENT METHOD

  • Liu, Jinkui
    • Journal of applied mathematics & informatics, v.29 no.5-6, pp.1303-1309, 2011
  • Based on the PRP method, a new spectral PRP conjugate gradient method is proposed to solve general unconstrained optimization problems; it produces a sufficient descent search direction at every iteration without any line search. Under the Wolfe line search, we prove the global convergence of the new method for general nonconvex functions. The numerical results show that the new method is efficient for the given test problems.
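The direction update at the heart of any spectral PRP scheme looks roughly as follows; the specific spectral parameter θ the paper chooses to guarantee sufficient descent is not given in the abstract, so it is left as an input (θ = 1 recovers plain PRP).

```python
import numpy as np

def spectral_prp_direction(g_new, g_old, d_old, theta=1.0):
    """Search direction of a spectral PRP scheme:
    d = -theta * g_new + beta_prp * d_old."""
    beta = g_new @ (g_new - g_old) / (g_old @ g_old)  # PRP beta
    return -theta * g_new + beta * d_old
```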

Optimization Inverse Design Technique for Fluid Machinery Impellers

  • Kim J. S.;Park W. G.
    • Journal of computational fluids engineering, v.3 no.1, pp.37-45, 1998
  • A new and efficient inverse design method based on a numerical optimization technique has been developed. The 2-D incompressible Navier-Stokes equations are solved to obtain the objective functions and are coupled with the optimization procedure to perform the inverse design. The steepest descent and conjugate gradient methods are applied to find the search direction, and the golden section method is applied to compute the design-variable step intervals. The airfoil and pump impellers are found to converge well to their target shapes.
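Golden section search, the one-dimensional routine the authors use for the design-variable intervals, is standard enough to sketch directly; the bracket and test function below are illustrative.

```python
import numpy as np

def golden_section(phi, a, b, tol=1e-6):
    """Golden section search: shrink the bracket [a, b] by the golden
    ratio until the minimizer of the scalar function phi is located
    within tol, reusing one interior evaluation per iteration."""
    r = (np.sqrt(5) - 1) / 2  # ~0.618
    c, d = b - r * (b - a), a + r * (b - a)
    while abs(b - a) > tol:
        if phi(c) < phi(d):
            b, d = d, c
            c = b - r * (b - a)
        else:
            a, c = c, d
            d = a + r * (b - a)
    return (a + b) / 2

# Step length along a descent direction for phi(t) = (t - 2)^2.
print(golden_section(lambda t: (t - 2.0) ** 2, 0.0, 5.0))  # ~ 2.0
```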


A study on the design optimization of baseframe to avoid resonance of diesel generator set

  • Jeong, S.H.;Kwak, Y.S.;Kim, W.H.
    • Proceedings of the Korean Society for Noise and Vibration Engineering Conference, 2012.04a, pp.157-162, 2012
  • Structural modification of the baseframe is an effective method to avoid resonance in a marine diesel generator (D/G) set, which consists of a diesel engine, a generator, and a baseframe. However, reinforcement with thick plates or additional parts to increase the natural frequency can be counterproductive because of the added weight, and fine, experience-based control of the target mode is difficult because the weight and interference of the system must be considered. In this paper, the design of the baseframe was optimized to reduce resonant vibration using a gradient descent method. Design parameters such as the thickness, shape, and location of the baseframe parts were optimized to increase the torsional natural frequency of the D/G set. In an actual test, the newly designed baseframe reduced the vibration level at resonance by 55% without any increase in weight or interference.
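The optimization loop itself can be sketched generically: gradient descent on a penalty that rewards raising the torsional frequency toward a target while charging for added mass, with gradients taken by finite differences. The frequency and mass models below are hypothetical stand-ins for the paper's finite element analysis, and all numbers are illustrative.

```python
import numpy as np

def natural_frequency(t):
    """Hypothetical stand-in for the FE modal analysis: a torsional
    frequency that grows with plate thickness t[0] and web height t[1].
    In practice this would be a call to the FE solver."""
    return 10.0 * np.sqrt(t[0]) + 4.0 * t[1]

def mass(t):
    return 3.0 * t[0] + 2.0 * t[1]  # hypothetical weight model

target = 38.0  # required torsional frequency, illustrative
t = np.array([1.0, 1.0])
lr, h = 0.01, 1e-5
for _ in range(500):
    # Penalize the frequency shortfall plus a small charge for added mass.
    cost = lambda p: max(target - natural_frequency(p), 0.0) ** 2 + 0.01 * mass(p)
    g = np.array([(cost(t + h * e) - cost(t - h * e)) / (2 * h)
                  for e in np.eye(2)])  # finite-difference gradient
    t -= lr * g
print(t, natural_frequency(t))  # parameters that meet the frequency target
```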
