Q-deformed and λ-parametrized hyperbolic tangent function

Document Type: Research Paper


George A. Anastassiou
Department of Mathematical Sciences, University of Memphis, Memphis, TN 38152, U.S.A.


Here we study the multivariate quantitative approximation of complex-valued continuous functions on a box of R^N, N ∈ N, by multivariate normalized neural network operators. We also investigate approximation by iterated multilayer neural network operators. These approximations are achieved by establishing multidimensional Jackson-type inequalities involving the multivariate moduli of continuity of the engaged function and its partial derivatives. Our multivariate operators are defined using a multidimensional density function induced by a q-deformed and λ-parametrized hyperbolic tangent function, which is a sigmoid function. The approximations are pointwise and uniform. The related feed-forward neural networks have one or multiple hidden layers. The basis of our theory is the introduced multivariate Taylor formulae of trigonometric and hyperbolic type.
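As a rough illustration of the activation and the density it induces, the sketch below assumes the standard form of the q-deformed and λ-parametrized hyperbolic tangent from the author's earlier univariate work, g_{q,λ}(x) = (e^{λx} − q e^{−λx})/(e^{λx} + q e^{−λx}) with q, λ > 0, together with the kernel M_{q,λ}(x) = (1/4)(g_{q,λ}(x+1) − g_{q,λ}(x−1)); the parameter values and the width-1 shift are illustrative, not prescribed by this paper.

```python
import math

def g(x, q=1.0, lam=1.0):
    # q-deformed, lambda-parametrized hyperbolic tangent (assumed form);
    # for q = 1 and lam = 1 it reduces to the classical tanh(x)
    return (math.exp(lam * x) - q * math.exp(-lam * x)) / \
           (math.exp(lam * x) + q * math.exp(-lam * x))

def density(x, q=1.0, lam=1.0):
    # bell-shaped density induced by g; since g(x) -> +/-1 as x -> +/-inf,
    # the integer translates of this kernel telescope to a partition of unity
    return 0.25 * (g(x + 1, q, lam) - g(x - 1, q, lam))
```

In the multivariate operators, products of such univariate densities over the coordinates supply the multidimensional density function mentioned in the abstract; the telescoping sum over integer translates equals 1 for any q, λ > 0, which is what makes the operators normalized.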


Multilayer approximation; q-Deformed and λ-parametrized hyperbolic tangent function; Multivariate trigonometric and hyperbolic neural network approximation; Quasi-interpolation operator; Multivariate modulus of continuity; Iterated approximation.



Volume 6, Issue 3
October 2023
Pages 141-164






Copyright (c) 2023 Authors


This work is licensed under a Creative Commons Attribution 4.0 International License.