• Title/Summary/Keyword: regularization method


How to improve oil consumption forecast using google trends from online big data?: the structured regularization methods for large vector autoregressive model

  • Choi, Ji-Eun;Shin, Dong Wan
    • Communications for Statistical Applications and Methods
    • /
    • v.29 no.1
    • /
    • pp.41-51
    • /
    • 2022
  • We forecast the US oil consumption level by taking advantage of Google Trends, the search volumes of specific terms that people search for on Google. We focus on whether proper selection of Google Trends terms leads to an improvement in forecast performance for oil consumption. As forecast models, we consider least absolute shrinkage and selection operator (LASSO) regression and the structured regularization method for the large vector autoregressive (VAR-L) model of Nicholson et al. (2017), which automatically select the Google Trends terms and the lags of the predictors. An out-of-sample forecast comparison reveals that reducing the high-dimensional Google Trends data set to a low-dimensional one by the LASSO and VAR-L models produces better forecast performance for oil consumption than frequently used forecast models such as the autoregressive model, the autoregressive distributed lag model, and the vector error correction model.
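The selection step described above can be sketched in a few lines. This is a minimal toy, not the authors' VAR-L setup: the series and the lag structure are synthetic stand-ins for oil consumption and the search-volume predictors, and the LASSO penalty `alpha` is an arbitrary assumption.

```python
# Hypothetical illustration: LASSO picks the relevant lagged predictors
# out of a larger pool, as the abstract describes for google trend terms.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
T, k, p = 200, 10, 2                      # time points, candidate terms, lags
trends = rng.normal(size=(T, k))
# the target depends on only two of the k predictors, at lag 1
y = 0.8 * trends[:-1, 0] - 0.5 * trends[:-1, 3] + 0.1 * rng.normal(size=T - 1)

# lagged design matrix: columns are trend_j at lags 1..p
X = np.hstack([trends[p - l : T - l] for l in range(1, p + 1)])
y = y[p - 1:]                             # align targets with the lagged rows

model = Lasso(alpha=0.05).fit(X, y)
selected = np.flatnonzero(model.coef_)    # columns the L1 penalty kept
```

With this setup the two truly predictive lag-1 columns (indices 0 and 3) survive the penalty, which is the automatic-selection behavior the abstract attributes to the LASSO and VAR-L models.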

SATURATION-VALUE TOTAL VARIATION BASED COLOR IMAGE DENOISING UNDER MIXED MULTIPLICATIVE AND GAUSSIAN NOISE

  • JUNG, MIYOUN
    • Journal of the Korean Society for Industrial and Applied Mathematics
    • /
    • v.26 no.3
    • /
    • pp.156-184
    • /
    • 2022
  • In this article, we propose a novel variational model for restoring color images corrupted by mixed multiplicative Gamma noise and additive Gaussian noise. The model involves a data-fidelity term that characterizes the mixed noise as an infimal convolution of two noise distributions and the saturation-value total variation (SVTV) regularization. The data-fidelity term facilitates suitable separation of the multiplicative Gamma and Gaussian noise components, promoting simultaneous elimination of the mixed noise. Furthermore, the SVTV regularization enables adequate denoising of homogeneous regions, while maintaining edges and details and diminishing the color artifacts induced by noise. To solve the proposed nonconvex model, we exploit an alternating minimization approach, and then the alternating direction method of multipliers is adopted for solving subproblems. This contributes to an efficient iterative algorithm. The experimental results demonstrate the superior performance of the proposed model compared to other existing or related models, with regard to visual inspection and image quality measurements.
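The SVTV model itself is beyond a snippet, but the ADMM machinery the abstract relies on can be illustrated on the much simpler 1-D total variation denoising problem min_x ½‖x − f‖² + λ‖Dx‖₁. This is an assumed toy surrogate, not the paper's nonconvex color model; λ, ρ, and the iteration count are arbitrary choices.

```python
# ADMM for 1-D TV denoising: split z = Dx, then alternate a quadratic
# x-update, a soft-threshold z-update, and a dual ascent on u.
import numpy as np

rng = np.random.default_rng(4)
n = 100
truth = np.concatenate([np.zeros(50), np.ones(50)])   # piecewise-constant signal
f = truth + 0.1 * rng.normal(size=n)                  # noisy observation

D = np.diff(np.eye(n), axis=0)            # (n-1) x n forward-difference matrix
lam, rho = 0.5, 1.0
M = np.eye(n) + rho * D.T @ D             # system matrix of the x-subproblem

x, z, u = f.copy(), D @ f, np.zeros(n - 1)
for _ in range(200):
    x = np.linalg.solve(M, f + rho * D.T @ (z - u))        # quadratic subproblem
    v = D @ x + u
    z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0)  # soft threshold
    u = u + D @ x - z                                      # dual update
```

As in the paper, the point of the splitting is that each subproblem is easy: the x-update is a linear solve and the z-update is a closed-form shrinkage.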

STRONG CONVERGENCE OF A METHOD FOR VARIATIONAL INEQUALITY PROBLEMS AND FIXED POINT PROBLEMS OF A NONEXPANSIVE SEMIGROUP IN HILBERT SPACES

  • Buong, Nguyen
    • Journal of applied mathematics & informatics
    • /
    • v.29 no.1_2
    • /
    • pp.61-74
    • /
    • 2011
  • In this paper, we introduce a new iteration method, based on the hybrid method in mathematical programming and a descent-like method, for finding a common element of the solution set of a variational inequality and the set of common fixed points of a nonexpansive semigroup in Hilbert spaces. We obtain strong convergence of the sequence generated by our method. The result modifies and improves some well-known results in the literature for a more general problem.

A MEMORY EFFICIENT INCREMENTAL GRADIENT METHOD FOR REGULARIZED MINIMIZATION

  • Yun, Sangwoon
    • Bulletin of the Korean Mathematical Society
    • /
    • v.53 no.2
    • /
    • pp.589-600
    • /
    • 2016
  • In this paper, we propose a new incremental gradient method, with an adaptive stepsize, for solving a regularized minimization problem whose objective is the sum of m smooth functions and a (possibly nonsmooth) convex function. Recently proposed incremental gradient methods for such problems require O(mn) storage, where n is the number of variables, which is their drawback; the proposed method requires only O(n) storage.
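The storage claim is easy to see in code: an incremental proximal gradient iteration touches one component function at a time and keeps only the current iterate. This is a hypothetical sketch, not Yun's scheme; it uses quadratic components f_i(x) = ½(aᵢ·x − bᵢ)², an L1 regularizer, and a constant stepsize in place of the paper's adaptive rule.

```python
# Incremental proximal gradient for  min_x  sum_i f_i(x) + lam*||x||_1.
# The only state carried across iterations is the n-vector x: O(n) memory.
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def incremental_prox_grad(A, b, lam, step=0.02, epochs=300):
    m, n = A.shape
    x = np.zeros(n)                          # O(n) state: just the iterate
    for _ in range(epochs):
        for i in range(m):                   # one component gradient at a time
            g = (A[i] @ x - b[i]) * A[i]
            x = soft_threshold(x - step * g, step * lam)
    return x

rng = np.random.default_rng(1)
A = rng.normal(size=(50, 5))
x_true = np.array([2.0, 0.0, -1.0, 0.0, 0.5])
b = A @ x_true                               # noise-free observations
x_hat = incremental_prox_grad(A, b, lam=0.01)
```

A full-gradient or aggregated method would instead cache per-component gradients, which is exactly the O(mn) cost the abstract contrasts against.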

A REGULARIZED CORRECTION METHOD FOR ELLIPTIC PROBLEMS WITH A SINGULAR FORCE

  • Kim, Hyea-Hyun
    • Journal of the Korean Mathematical Society
    • /
    • v.49 no.5
    • /
    • pp.927-945
    • /
    • 2012
  • An approximation of singular source terms in elliptic problems is developed and analyzed. Under certain assumptions on the curve where the singular source is defined, second order convergence in the maximum norm can be proved. Numerical results show better performance than previously developed regularization techniques.

Neural Networks which Approximate One-to-Many Mapping

  • Lee, Choon-Young;Lee, Ju-Jang
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2001.10a
    • /
    • pp.41.5-41
    • /
    • 2001
  • A novel method is introduced for determining the weights of a regularization network that approximates a one-to-many mapping. A conventional neural network converges to the average value when there are multiple outputs for one input. The capability of the proposed network is demonstrated by an example of learning an inverse mapping.

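The averaging behavior the abstract mentions follows directly from the squared loss. A minimal numeric illustration (not the paper's regularization network): when a single input has two valid outputs, +1 and −1, the MSE-optimal prediction is their mean, 0, which is not a valid output at all.

```python
# Scan candidate constant predictions and find the minimizer of the
# squared loss over the two conflicting targets for the same input.
import numpy as np

targets = np.array([1.0, -1.0])            # two correct outputs, one input
candidates = np.linspace(-1.5, 1.5, 301)
mse = [np.mean((targets - c) ** 2) for c in candidates]
best = candidates[int(np.argmin(mse))]     # lands at the mean of the targets
```

This is why a one-to-many mapping needs machinery beyond plain MSE fitting, such as the weight-determination method the paper proposes.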

AN AUTOMATIC AUGMENTED GALERKIN METHOD FOR SINGULAR INTEGRAL EQUATIONS WITH HILBERT KERNEL

  • Abbasbandy, S.;Babolian, E.
    • Journal of applied mathematics & informatics
    • /
    • v.8 no.2
    • /
    • pp.429-437
    • /
    • 2001
  • In [1, 2], a Chebyshev series method was described for the numerical solution of integral equations, with three automatic algorithms for computing two regularization parameters, C_f and r. Here we describe a Fourier series expansion method for a class of singular integral equations with Hilbert kernel and constant coefficients, using a new automatic algorithm.

RESTORATION OF BLURRED IMAGES BY GLOBAL LEAST SQUARES METHOD

  • Chung, Sei-young;Oh, SeYoung;Kwon, SunJoo
    • Journal of the Chungcheong Mathematical Society
    • /
    • v.22 no.2
    • /
    • pp.177-186
    • /
    • 2009
  • The global least squares method (Gl-LSQR) is a generalization of the LSQR method for solving linear systems with multiple right-hand sides. In this paper, we show how to apply this algorithm to the image restoration problem and illustrate its usefulness and effectiveness through numerical experiments.

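The single-right-hand-side building block is available off the shelf. A minimal sketch using SciPy's `lsqr` (the plain method, not the global Gl-LSQR variant of the paper) on a toy 1-D blur: build a banded blurring matrix, blur a signal, and recover it by least squares. The 3-point blur weights are an arbitrary assumption.

```python
# Restore a blurred 1-D signal with LSQR: A is a sparse tridiagonal
# moving-average blur, b = A x_true the (noise-free) observation.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import lsqr

n = 100
A = diags([0.2, 0.6, 0.2], [-1, 0, 1], shape=(n, n), format="csr")
x_true = np.zeros(n)
x_true[40:60] = 1.0                        # a box-shaped "image"
b = A @ x_true                             # blurred observation
x_rec = lsqr(A, b, atol=1e-10, btol=1e-10)[0]
```

With multiple right-hand sides (e.g. the color channels of an image), Gl-LSQR processes the whole block at once rather than looping this call.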

IMAGE DEBLURRING USING GLOBAL PCG METHOD WITH KRONECKER PRODUCT PRECONDITIONER

  • KIM, KYOUM SUN;YUN, JAE HEON
    • Journal of applied mathematics & informatics
    • /
    • v.36 no.5_6
    • /
    • pp.531-540
    • /
    • 2018
  • We first show how to construct the linear operator equations corresponding to Tikhonov regularization problems for solving image deblurring problems with nearly separable point spread functions. We next propose a Kronecker product preconditioner that is suitable for the global PCG method. Lastly, we provide numerical experiments of the global PCG method with the Kronecker product preconditioner for several image deblurring problems to evaluate its effectiveness.
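The Tikhonov linear-operator setup can be sketched matrix-free. This is plain conjugate gradients on the regularized normal equations, under the assumption of a separable blur acting as X ↦ C X Bᵀ; it omits the paper's Kronecker product preconditioner, and the factors B, C and the parameter mu are synthetic.

```python
# Solve (A^T A + mu I) x = A^T b by CG, where A is the separable blur
# vec(X) -> vec(C @ X @ B.T), applied without ever forming the big matrix.
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

n, mu = 16, 1e-3
rng = np.random.default_rng(2)
B = np.eye(n) + 0.05 * rng.normal(size=(n, n))   # toy separable blur factors
C = np.eye(n) + 0.05 * rng.normal(size=(n, n))

def blur(x):                 # apply A: two small matrix products
    return (C @ x.reshape(n, n) @ B.T).ravel()

def blur_t(y):               # apply A^T (adjoint of X -> C X B^T is Y -> C^T Y B)
    return (C.T @ y.reshape(n, n) @ B).ravel()

X_true = rng.normal(size=(n, n))
b = blur(X_true.ravel())

normal_op = LinearOperator((n * n, n * n),
                           matvec=lambda x: blur_t(blur(x)) + mu * x)
x_rec, info = cg(normal_op, blur_t(b), maxiter=500)
```

A Kronecker product preconditioner, as in the paper, would wrap an approximate inverse of this operator around each CG step to cut the iteration count.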

TWO DIMENSIONAL VERSION OF LEAST SQUARES METHOD FOR DEBLURRING PROBLEMS

  • Kwon, SunJoo;Oh, SeYoung
    • Journal of the Chungcheong Mathematical Society
    • /
    • v.24 no.4
    • /
    • pp.895-903
    • /
    • 2011
  • A two-dimensional version of the LSQR iterative algorithm, which takes advantage of working solely with 2-dimensional arrays, is developed and applied to the image deblurring problem. The efficiency of the method compared to the Fourier-based LSQR method and the 2-D version of the CGLS algorithm proposed by Hanson ([4]) is analyzed.