• Title/Abstract/Keywords: quasi-interpolation operator

Search results: 2

GENERALIZED SYMMETRICAL SIGMOID FUNCTION ACTIVATED NEURAL NETWORK MULTIVARIATE APPROXIMATION

  • ANASTASSIOU, GEORGE A.
    • Journal of Applied and Pure Mathematics, Vol. 4, No. 3/4, pp. 185-209, 2022
  • Here we exhibit multivariate quantitative approximations of Banach space valued continuous multivariate functions on a box or ℝ^N, N ∈ ℕ, by the multivariate normalized, quasi-interpolation, Kantorovich type and quadrature type neural network operators. We also treat the case of approximation by iterated operators of these four types. These approximations are achieved by establishing multidimensional Jackson type inequalities involving the multivariate modulus of continuity of the engaged function or its high order Fréchet derivatives. Our multivariate operators are defined by using a multidimensional density function induced by the generalized symmetrical sigmoid function. The approximations are pointwise and uniform. The related feed-forward neural network has one hidden layer. (A minimal illustrative sketch follows this entry.)
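The abstract describes the construction only in prose, so the following is a minimal univariate sketch of a normalized quasi-interpolation operator of this general type. It is not the paper's own definition: the logistic sigmoid stands in for the generalized symmetrical sigmoid, the density Φ(x) = ½(σ(x+1) − σ(x−1)) is one standard way to induce a density from a sigmoidal activation, and the names sigma, phi and quasi_interpolation are illustrative only. The paper's multivariate operators are built from a multidimensional density of this kind.

```python
import numpy as np

def sigma(x):
    # Logistic sigmoid, used here as a stand-in for the paper's
    # generalized symmetrical sigmoid activation (an assumption).
    return 1.0 / (1.0 + np.exp(-x))

def phi(x):
    # Density induced by the activation: positive, even, and its integer
    # shifts sum to 1 when sigma is the logistic sigmoid.
    return 0.5 * (sigma(x + 1.0) - sigma(x - 1.0))

def quasi_interpolation(f, x, n, a=0.0, b=1.0):
    """Normalized quasi-interpolation operator on [a, b]:
    A_n(f, x) = sum_k f(k/n) phi(n*x - k) / sum_k phi(n*x - k),
    with k running over ceil(n*a) .. floor(n*b)."""
    k = np.arange(np.ceil(n * a), np.floor(n * b) + 1)
    weights = phi(n * x - k)          # one hidden-layer unit per node k/n
    return np.dot(f(k / n), weights) / np.sum(weights)

# Example: the error at a fixed point shrinks as n grows.
f = lambda t: np.sin(2 * np.pi * t)
for n in (16, 64, 256):
    print(n, abs(quasi_interpolation(f, 0.37, n) - f(0.37)))
```

Each term φ(nx − k) plays the role of one hidden-layer unit, which is why the associated feed-forward network has a single hidden layer.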

PARAMETRIZED GUDERMANNIAN FUNCTION RELIED BANACH SPACE VALUED NEURAL NETWORK MULTIVARIATE APPROXIMATIONS

  • ANASTASSIOU, GEORGE A.
    • Journal of Applied and Pure Mathematics, Vol. 5, No. 1/2, pp. 69-93, 2023
  • Here we give multivariate quantitative approximations of Banach space valued continuous multivariate functions on a box or ℝ^N, N ∈ ℕ, by the multivariate normalized, quasi-interpolation, Kantorovich type and quadrature type neural network operators. We also treat the case of approximation by iterated operators of these four types. These approximations are derived by establishing multidimensional Jackson type inequalities involving the multivariate modulus of continuity of the engaged function or its high order Fréchet derivatives. Our multivariate operators are defined by using a multidimensional density function induced by a parametrized Gudermannian sigmoid function. The approximations are pointwise and uniform. The related feed-forward neural network has one hidden layer. (A brief illustrative sketch follows this entry.)
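As a companion to the sketch after the first entry, the stub below swaps in the plain Gudermannian function gd(x) = 2 arctan(tanh(x/2)) as a stand-in for the paper's parametrized Gudermannian sigmoid (an assumption), and shows a Kantorovich-type variant in which the point sample f(k/n) is replaced by the local average n∫_{k/n}^{(k+1)/n} f(t) dt. Function names and the midpoint quadrature are illustrative, not taken from the paper.

```python
import numpy as np

def gd(x):
    # Plain Gudermannian gd(x) = 2*arctan(tanh(x/2)); the paper uses a
    # parametrized variant, so this is only an illustrative stand-in.
    return 2.0 * np.arctan(np.tanh(0.5 * x))

def phi(x):
    # Density induced by gd; the 1/(2*pi) factor makes the integer shifts
    # sum to 1, since gd ranges over (-pi/2, pi/2).
    return (gd(x + 1.0) - gd(x - 1.0)) / (2.0 * np.pi)

def kantorovich(f, x, n, a=0.0, b=1.0, quad_pts=32):
    """Kantorovich-type variant: the sample f(k/n) is replaced by the cell
    average n * integral_{k/n}^{(k+1)/n} f(t) dt, approximated here by a
    midpoint rule with quad_pts points per cell."""
    k = np.arange(np.ceil(n * a), np.floor(n * b) + 1)
    t = (k[:, None] + (np.arange(quad_pts) + 0.5) / quad_pts) / n
    averages = f(t).mean(axis=1)      # cell average of f on [k/n, (k+1)/n]
    weights = phi(n * x - k)
    return np.dot(averages, weights) / np.sum(weights)

f = lambda t: np.exp(-t) * np.cos(3.0 * t)
print(abs(kantorovich(f, 0.5, 200) - f(0.5)))   # small for large n
```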