• Title/Summary/Keyword: multivariate modulus of continuity

GENERALIZED SYMMETRICAL SIGMOID FUNCTION ACTIVATED NEURAL NETWORK MULTIVARIATE APPROXIMATION

  • ANASTASSIOU, GEORGE A.
    • Journal of Applied and Pure Mathematics, v.4 no.3_4, pp.185-209, 2022
  • Here we exhibit multivariate quantitative approximations of Banach space valued continuous multivariate functions on a box or ℝ^N, N ∈ ℕ, by the multivariate normalized, quasi-interpolation, Kantorovich type and quadrature type neural network operators. We also treat the case of approximation by iterated operators of the last four types. These approximations are achieved by establishing multidimensional Jackson type inequalities involving the multivariate modulus of continuity of the engaged function or its high-order Fréchet derivatives. Our multivariate operators are defined by using a multidimensional density function induced by the generalized symmetrical sigmoid function. The approximations are pointwise and uniform. The related feed-forward neural network has one hidden layer.
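
For orientation, the following is a minimal sketch of the objects named in this abstract. The multivariate modulus of continuity is stated in its standard form; the specific generalized symmetrical sigmoid f_λ, the 1/4 normalization of the density, and the index set of the operator A_n are plausible assumptions modeled on this line of work, not taken from the paper itself.

```latex
% Sketch only; exact constants and index ranges in the paper may differ.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Multivariate (first) modulus of continuity of f : \prod_{i=1}^N [a_i,b_i] \to (X, \|\cdot\|_X):
\[
  \omega_1(f,\delta) \;:=\; \sup_{\|x-y\|_\infty \le \delta} \|f(x)-f(y)\|_X ,
  \qquad \delta > 0 .
\]

% One common form of a generalized symmetrical sigmoid (assumption):
\[
  f_\lambda(x) \;=\; \frac{x}{\left(1+|x|^\lambda\right)^{1/\lambda}} ,
  \qquad \lambda > 0 .
\]

% Induced one-dimensional density and its multivariate product (assumption):
\[
  \psi(x) \;:=\; \tfrac{1}{4}\bigl(f_\lambda(x+1)-f_\lambda(x-1)\bigr) ,
  \qquad
  Z(x_1,\dots,x_N) \;:=\; \prod_{i=1}^{N} \psi(x_i) .
\]

% Normalized quasi-interpolation neural network operator on \mathbb{R}^N (assumption):
\[
  A_n(f,x) \;:=\;
  \frac{\sum_{k\in\mathbb{Z}^N} f\bigl(\tfrac{k}{n}\bigr)\, Z(nx-k)}
       {\sum_{k\in\mathbb{Z}^N} Z(nx-k)} .
\]

\end{document}
```

A Jackson-type inequality of the kind described in the abstract would then bound ‖A_n(f, x) − f(x)‖_X by the multivariate modulus of continuity of f (or of its Fréchet derivatives, in the smooth case) evaluated at a scale shrinking with n, plus a tail term controlled by the decay of Z.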

PARAMETRIZED GUDERMANNIAN FUNCTION RELIED BANACH SPACE VALUED NEURAL NETWORK MULTIVARIATE APPROXIMATIONS

  • ANASTASSIOU, GEORGE A.
    • Journal of Applied and Pure Mathematics, v.5 no.1_2, pp.69-93, 2023
  • Here we give multivariate quantitative approximations of Banach space valued continuous multivariate functions on a box or ℝ^N, N ∈ ℕ, by the multivariate normalized, quasi-interpolation, Kantorovich type and quadrature type neural network operators. We also treat the case of approximation by iterated operators of the last four types. These approximations are derived by establishing multidimensional Jackson type inequalities involving the multivariate modulus of continuity of the engaged function or its high-order Fréchet derivatives. Our multivariate operators are defined by using a multidimensional density function induced by a parametrized Gudermannian sigmoid function. The approximations are pointwise and uniform. The related feed-forward neural network has one hidden layer.
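
For the second paper, the Gudermannian function itself is standard; the particular parametrization and normalization below are assumptions, chosen so the resulting sigmoid ranges over (−1, 1) and can play the same role as f_λ in the sketch above.

```latex
% Sketch only; the paper's parametrization may differ.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% The Gudermannian function (standard definition):
\[
  \operatorname{gd}(x) \;=\; 2\arctan\bigl(\tanh(x/2)\bigr)
  \;=\; 2\arctan\bigl(e^{x}\bigr)-\frac{\pi}{2} ,
  \qquad \operatorname{gd}:\mathbb{R}\to\Bigl(-\tfrac{\pi}{2},\tfrac{\pi}{2}\Bigr) .
\]

% A parametrized, normalized Gudermannian sigmoid (assumption, parameter \lambda > 0):
\[
  g_\lambda(x) \;:=\; \frac{2}{\pi}\,\operatorname{gd}(\lambda x) \;\in\; (-1,1) .
\]

% g_\lambda is odd and strictly increasing, so the same construction as before,
%   \psi(x) := \tfrac14\bigl(g_\lambda(x+1)-g_\lambda(x-1)\bigr),
%   Z(x_1,\dots,x_N) := \prod_{i=1}^{N} \psi(x_i),
% yields the multidimensional density from which the normalized, quasi-interpolation,
% Kantorovich type and quadrature type operators are built.

\end{document}
```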