Banach space valued quasi-interpolation operator
Open Access Article

q-Deformed and β-parametrized half hyperbolic tangent based Banach space valued ordinary and fractional neural network approximation

Annals of Communications in Mathematics 2023, 6 (1), 1-16. DOI: https://doi.org/10.62072/acm2023060101

Abstract: Here we study the univariate quantitative approximation, ordinary and fractional, of Banach space valued continuous functions on a compact interval or on the whole real line by quasi-interpolation Banach space valued neural network operators. These approximations are derived by establishing Jackson type inequalities involving the modulus of continuity of the engaged function or of its Banach space valued high order ordinary or fractional derivatives. Our operators are defined by using a density function generated by a q-deformed and β-parametrized half hyperbolic tangent function, which is a sigmoid function. The approximations are pointwise and with respect to the uniform norm. The related Banach space valued feed-forward neural networks have one hidden layer.
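
A minimal scalar-valued numerical sketch of the kind of construction the abstract describes: a sigmoid generates a bell-shaped density by differencing shifts, and the quasi-interpolation operator is a normalized sum of samples f(k/n). The particular formula used here for the q-deformed, β-parametrized half hyperbolic tangent and the helper names (sigmoid, density, quasi_interp) are illustrative assumptions, not the paper's definitions.

import numpy as np

def sigmoid(x, q=1.0, beta=1.0):
    # Assumed form of a q-deformed, beta-parametrized half hyperbolic
    # tangent type sigmoid (for q = beta = 1 this is tanh(x/2)); the
    # paper's exact definition may differ.
    return (1.0 - q * np.exp(-beta * x)) / (1.0 + q * np.exp(-beta * x))

def density(x, q=1.0, beta=1.0):
    # Bell-shaped density generated from the sigmoid as a scaled
    # difference of shifts; its integer translates sum to 1.
    return 0.25 * (sigmoid(x + 1.0, q, beta) - sigmoid(x - 1.0, q, beta))

def quasi_interp(f, x, n, q=1.0, beta=1.0, K=200):
    # Quasi-interpolation neural network operator (one hidden layer):
    # a normalized sum of samples f(k/n) weighted by density(n*x - k).
    k = np.arange(int(np.floor(n * x)) - K, int(np.ceil(n * x)) + K + 1)
    w = density(n * x - k, q, beta)
    return np.dot(w, f(k / n)) / np.sum(w)

# Pointwise error at x = 0.3 shrinks as n grows, as the Jackson type
# estimates predict.
f = np.cos
for n in (10, 100, 1000):
    print(n, abs(quasi_interp(f, 0.3, n, q=1.5, beta=2.0) - f(0.3)))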
Open Access Article

Parametrized error function based Banach space valued univariate neural network approximation

Annals of Communications in Mathematics 2023, 6 (1), 31-43. DOI: https://doi.org/10.62072/acm2023060104

Abstract: Here we study the univariate quantitative approximation of Banach space valued continuous functions on a compact interval or on the whole real line by quasi-interpolation Banach space valued neural network operators. We also perform the related Banach space valued fractional approximation. These approximations are derived by establishing Jackson type inequalities involving the modulus of continuity of the engaged function or of its Banach space valued high order derivative or fractional derivatives. Our operators are defined by using a density function induced by a parametrized error function. The approximations are pointwise and with respect to the uniform norm. The related Banach space valued feed-forward neural networks have one hidden layer. We finish with a convergence analysis.
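
A similar scalar-valued sketch, assuming one plausible form of the parametrized error function sigmoid; the exact normalization and the names chi, psi, A_n are illustrative, not taken from the paper. The loop at the end illustrates, on a compact interval, the kind of uniform-norm convergence the abstract's convergence analysis refers to.

import numpy as np
from scipy.special import erf

def chi(x, lam=2.0):
    # Assumed parametrized error function sigmoid, taking values in (0, 1);
    # the paper's exact normalization may differ.
    return 0.5 * (erf(lam * x) + 1.0)

def psi(x, lam=2.0):
    # Density induced by chi as a difference of shifts; its integer
    # translates sum to 1.
    return 0.5 * (chi(x + 1.0, lam) - chi(x - 1.0, lam))

def A_n(f, x, n, lam=2.0, K=100):
    # Quasi-interpolation neural network operator built from psi.
    k = np.arange(int(np.floor(n * x)) - K, int(np.ceil(n * x)) + K + 1)
    w = psi(n * x - k, lam)
    return np.dot(w, f(k / n)) / np.sum(w)

# Uniform-norm error on the compact interval [-1, 1]: the error decays
# as n grows.
f = lambda t: np.abs(t) ** 1.5
xs = np.linspace(-1.0, 1.0, 201)
for n in (10, 100, 1000):
    print(n, max(abs(A_n(f, x, n) - f(x)) for x in xs))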
Open Access Article

General Multiple Sigmoid Functions Relied Complex Valued Multivariate Trigonometric and Hyperbolic Neural Network Approximations

Annals of Communications in Mathematics 2025, 8 (1), 80-102. DOI: https://doi.org/10.62072/acm.2025.080107

Abstract: Here we study the multivariate quantitative approximation of complex valued continuous functions on a box of ℝ^N, N ∈ ℕ, by multivariate normalized type neural network operators. We also investigate approximation by iterated multilayer neural network operators. These approximations are achieved by establishing multidimensional Jackson type inequalities involving the multivariate moduli of continuity of the engaged function and its partial derivatives. Our multivariate operators are defined by using a multidimensional density function induced by general multiple sigmoid functions. The approximations are pointwise and uniform. The related feed-forward neural networks have one or more hidden layers. The basis of our theory is formed by the introduced multivariate Taylor formulae of trigonometric and hyperbolic type.
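
A hedged multivariate sketch under the same caveats: the product-type multidimensional density, the tanh stand-in for a "general sigmoid function", and the names Z and A_n are illustrative assumptions. Complex valued functions are handled directly, since the operator is linear and acts on the real and imaginary parts separately.

import numpy as np

def sigmoid(x, lam=1.0):
    # tanh is used purely as a stand-in for one of the general sigmoid
    # functions allowed by the paper.
    return np.tanh(lam * x)

def psi(x, lam=1.0):
    # Univariate bell-shaped density obtained from the sigmoid.
    return 0.25 * (sigmoid(x + 1.0, lam) - sigmoid(x - 1.0, lam))

def Z(x, lam=1.0):
    # Multidimensional density as a product of univariate densities;
    # x has shape (..., N).
    return np.prod(psi(x, lam), axis=-1)

def A_n(f, x, n, lam=1.0, K=40):
    # Normalized multivariate quasi-interpolation operator: a weighted
    # average of the samples f(k/n) over a grid of multi-indices k.
    x = np.asarray(x, dtype=float)
    axes = [np.arange(int(np.floor(n * xi)) - K, int(np.ceil(n * xi)) + K + 1)
            for xi in x]
    k = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1)  # shape (..., N)
    w = Z(n * x - k, lam)
    return np.sum(w * f(k / n)) / np.sum(w)

# Complex valued example on a box of R^2.
f = lambda p: np.cos(np.pi * p[..., 0]) + 1j * np.sin(np.pi * p[..., 1])
x = (0.3, -0.2)
for n in (10, 100):
    print(n, abs(A_n(f, x, n) - f(np.asarray(x, dtype=float))))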