• Title/Abstract/Keyword: multivariate neural network approximation

Search results: 6 (processing time: 0.018 s)

GENERALIZED SYMMETRICAL SIGMOID FUNCTION ACTIVATED NEURAL NETWORK MULTIVARIATE APPROXIMATION

  • ANASTASSIOU, GEORGE A.
    • Journal of Applied and Pure Mathematics / Vol. 4, No. 3-4 / pp.185-209 / 2022
  • Here we exhibit multivariate quantitative approximations of Banach space valued continuous multivariate functions on a box or on ℝ^N, N ∈ ℕ, by the multivariate normalized, quasi-interpolation, Kantorovich type and quadrature type neural network operators. We also treat the case of approximation by iterated operators of the last four types. These approximations are achieved by establishing multidimensional Jackson type inequalities involving the multivariate modulus of continuity of the engaged function or its high order Fréchet derivatives. Our multivariate operators are defined using a multidimensional density function induced by the generalized symmetrical sigmoid function. The approximations are pointwise and uniform. The related feed-forward neural network has one hidden layer.
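The density construction behind these operators can be sketched numerically. As a hedged illustration (the paper's exact definitions may differ), take a generalized symmetric sigmoid of the assumed form f(x) = x / (1 + |x|^μ)^(1/μ) and the induced density ψ(x) = (f(x+1) − f(x−1)) / 4; since f tends to ±1 at ±∞, the integer translates of ψ telescope to a partition of unity, which is what makes the normalized operators well defined:

```python
# Sketch with assumed forms: generalized symmetric sigmoid and induced density.
# Because f has limits -1 and +1, the sum of integer translates of psi
# telescopes to 1 (a partition of unity).

def f(x, mu=2.0):
    # Generalized symmetric sigmoid (assumed form), odd, with limits +-1.
    return x / (1.0 + abs(x) ** mu) ** (1.0 / mu)

def psi(x, mu=2.0):
    # Density induced by the sigmoid.
    return (f(x + 1.0, mu) - f(x - 1.0, mu)) / 4.0

def translate_sum(x, mu=2.0, n=2000):
    # Truncated sum over integer translates; should be close to 1.
    return sum(psi(x - k, mu) for k in range(-n, n + 1))
```

The telescoping argument is independent of μ: any odd sigmoid with limits ±1 yields the same normalization.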

PARAMETRIZED GUDERMANNIAN FUNCTION RELIED BANACH SPACE VALUED NEURAL NETWORK MULTIVARIATE APPROXIMATIONS

  • GEORGE A. ANASTASSIOU
    • Journal of Applied and Pure Mathematics / Vol. 5, No. 1-2 / pp.69-93 / 2023
  • Here we give multivariate quantitative approximations of Banach space valued continuous multivariate functions on a box or on ℝ^N, N ∈ ℕ, by the multivariate normalized, quasi-interpolation, Kantorovich type and quadrature type neural network operators. We also treat the case of approximation by iterated operators of the last four types. These approximations are derived by establishing multidimensional Jackson type inequalities involving the multivariate modulus of continuity of the engaged function or its high order Fréchet derivatives. Our multivariate operators are defined using a multidimensional density function induced by a parametrized Gudermannian sigmoid function. The approximations are pointwise and uniform. The related feed-forward neural network has one hidden layer.

DEGREE OF APPROXIMATION BY KANTOROVICH-CHOQUET QUASI-INTERPOLATION NEURAL NETWORK OPERATORS REVISITED

ANASTASSIOU, GEORGE A.
    • Journal of Applied and Pure Mathematics / Vol. 4, No. 5-6 / pp.269-286 / 2022
  • In this article we exhibit univariate and multivariate quantitative approximation by Kantorovich-Choquet type quasi-interpolation neural network operators with respect to the supremum norm. This is done with rates using the first univariate and multivariate moduli of continuity. We approximate continuous and bounded functions on ℝ^N, N ∈ ℕ. When they are also uniformly continuous we have pointwise and uniform convergences. Our activation functions are induced by the arctangent, algebraic, Gudermannian and generalized symmetrical sigmoid functions.

Nonlinear Function Approximation Using Efficient Higher-order Feedforward Neural Networks

  • 신요안
    • The Journal of Korean Institute of Communications and Information Sciences / Vol. 21, No. 1 / pp.251-268 / 1996
  • In this paper, a higher-order feedforward neural network called the ridge polynomial network (RPN), which shows good approximation capability for nonlinear continuous functions defined on compact subsets of multi-dimensional Euclidean spaces, is presented. This network provides a more efficient and regular structure than ordinary higher-order feedforward networks based on Gabor-Kolmogorov polynomial expansions, while maintaining their fast learning property. The ridge polynomial network is a generalization of the pi-sigma network (PSN) and uses a special form of ridge polynomials. It is shown that any multivariate polynomial can be exactly represented in this form, and thus realized by an RPN. The approximation capability of RPNs for arbitrary continuous functions follows from this representation theorem and the classical Weierstrass polynomial approximation theorem. The RPN provides a natural mechanism for incremental function approximation based on the learning algorithm of the PSN. Simulation results on several applications, such as multivariate function approximation and pattern classification, demonstrate the nonlinear approximation capability of the RPN.
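To make the construction concrete, here is a minimal numpy sketch (not the paper's code) of a pi-sigma unit, a product of linear "sigma" units, and a ridge polynomial network as a sum of pi-sigma units of increasing degree. A degree-d pi-sigma unit realizes a product of d ridge functions w_j·x + b_j, so a sum of such units of degrees 1..k can represent multivariate polynomials exactly, as the abstract's representation theorem states:

```python
import numpy as np

def pi_sigma(x, W, b):
    """Pi-sigma unit: product of linear units, prod_j (W[j] @ x + b[j])."""
    return np.prod(W @ x + b)

def rpn(x, layers):
    """Ridge polynomial network: sum of pi-sigma units of increasing degree.
    `layers` is a list of (W, b) pairs; the pair for degree d has d rows."""
    return sum(pi_sigma(x, W, b) for W, b in layers)

# Illustrative example: realize p(x, y) = (x + y) + (x - y)(2x + 1) exactly.
layers = [
    (np.array([[1.0, 1.0]]), np.array([0.0])),                     # degree 1
    (np.array([[1.0, -1.0], [2.0, 0.0]]), np.array([0.0, 1.0])),   # degree 2
]
```

Incremental approximation then amounts to appending a new higher-degree pi-sigma unit and training only its weights, which is the PSN learning mechanism the RPN inherits.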


A Study of a Predictive Method for Daechung Dam Inflow Using a Multivariate Neural Network Model

  • 강권수;염경택;허준행
    • Korea Water Resources Association / Proceedings of the 2012 KWRA Annual Conference / pp.359-362 / 2012
  • Better estimates of the future behavior of key hydrologic variables are needed for the design, planning, and operation of water resources systems. For example, when operating a multipurpose dam to maintain multiple functions such as hydropower generation, recreational use, and dilution of pollution in downstream areas, forecasts of scheduled flows for upcoming periods are required. The purpose of forecasting is to provide accurate estimates of what will occur in the future (Keith W. Hipel, 1994). The main objective of this study is to forecast inflow to Daechung Dam, in the Geum River basin, using multivariate neural network models. Using neural network models such as MLP, PCA, and RBF together with hydrologic data for Daechung Dam (rainfall, inflow, temperature, humidity), we searched for the optimal model, and found that the New Classification model and the New Function Approximation Network model gave better results than the other models.
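As an illustrative sketch only (the study's models and data are not reproduced here), a one-hidden-layer MLP regressor in numpy, trained by full-batch gradient descent on synthetic multivariate inputs standing in for rainfall, temperature, and humidity:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 3 features (e.g. rainfall, temperature, humidity)
# mapped to a scalar "inflow" target by an arbitrary smooth function.
X = rng.normal(size=(200, 3))
y = np.tanh(X @ np.array([1.0, -0.5, 0.3]))[:, None]

# One-hidden-layer MLP with tanh activation.
W1 = rng.normal(scale=0.5, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, H @ W2 + b2

_, pred0 = forward(X)
loss0 = float(np.mean((pred0 - y) ** 2))  # initial mean squared error

lr = 0.05
for _ in range(500):
    H, pred = forward(X)
    err = (pred - y) / len(X)               # d(MSE)/d(pred), up to a factor 2
    dH = err @ W2.T * (1.0 - H ** 2)        # backprop through tanh
    W2 -= lr * (H.T @ err); b2 -= lr * err.sum(0)
    W1 -= lr * (X.T @ dH);  b1 -= lr * dH.sum(0)

_, pred1 = forward(X)
loss1 = float(np.mean((pred1 - y) ** 2))    # error after training
```

A real inflow model would of course use measured time series and held-out validation data; this only shows the mechanics of the multivariate MLP family the study compares.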


Low Dimensional Modeling and Synthesis of Head-Related Transfer Functions (HRTFs) Using Nonlinear Feature Extraction Methods

  • 서상원;김기홍;김현석;김현빈;이의택
    • The Transactions of the Korea Information Processing Society / Vol. 7, No. 5 / pp.1361-1369 / 2000
  • For the implementation of a 3D sound localization system, binaural filtering by HRTFs is generally employed. But the HRTF filter is of high order, and its coefficients for all directions have to be stored, which imposes a rather large memory requirement. To cope with this, research has centered on obtaining low dimensional HRTF representations without significant loss of information, and on synthesizing the original HRTFs efficiently, by means of feature extraction methods for multivariate data, including PCA. In this line of research, conventional linear PCA was applied to frequency domain HRTF data, and the original HRTFs could be synthesized approximately using a relatively small number of principal components. In this paper we apply a neural network based nonlinear PCA model (NLPCA) and a nonlinear PLS regression model (NLPLS) to this low dimensional HRTF modeling and analyze the results in comparison with PCA. The NLPCA, which projects data onto nonlinear surfaces, showed more efficient HRTF feature extraction than linear PCA, and the NLPLS regression model, which incorporates direction information in feature extraction, yielded more stable results in synthesizing general HRTFs not included in the model training.
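The linear-PCA baseline described above can be sketched in a few lines of numpy (synthetic spectra stand in for measured HRTFs; the nonlinear NLPCA/NLPLS models are not reproduced): project the magnitude spectra onto the top principal components via SVD and reconstruct, with the error shrinking as components are added:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "HRTF-like" magnitude spectra: 72 directions x 128 frequency bins,
# generated from a few smooth latent components plus small noise.
freqs = np.linspace(0.0, 1.0, 128)
basis = np.stack([np.sin((k + 1) * np.pi * freqs) for k in range(4)])
coeffs = rng.normal(size=(72, 4))
spectra = coeffs @ basis + 0.01 * rng.normal(size=(72, 128))

def pca_reconstruct(data, n_components):
    """Reconstruct data from its top principal components (via SVD)."""
    mean = data.mean(axis=0)
    U, s, Vt = np.linalg.svd(data - mean, full_matrices=False)
    low = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    return low + mean

def rel_error(data, recon):
    # Relative Frobenius-norm reconstruction error.
    return float(np.linalg.norm(data - recon) / np.linalg.norm(data))
```

Storing only the component coefficients per direction, instead of full filter coefficients, is what yields the memory savings the paper targets; the nonlinear models pursue the same goal with curved projection surfaces.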
