Open Access Article
q-Deformed and λ-parametrized hyperbolic tangent function relied complex valued multivariate trigonometric and hyperbolic neural network approximations

George A. Anastassiou *
Department of Mathematical Sciences, University of Memphis, Memphis, TN 38152, U.S.A.

* Corresponding Author
Annals of Communications in Mathematics, 2023, 6(3), 141-164.
https://doi.org/10.62072/acm.2023.060301
Received: 13 June 2023 | Accepted: 15 September 2023 | Published: 31 October 2023

Abstract
Here we study the multivariate quantitative approximation of complex valued continuous functions on a box of R^N, N ∈ N, by multivariate normalized type neural network operators. We also investigate approximation by iterated multilayer neural network operators. These approximations are achieved by establishing multidimensional Jackson type inequalities involving the multivariate moduli of continuity of the engaged function and its partial derivatives. Our multivariate operators are defined using a multidimensional density function induced by a q-deformed and λ-parametrized hyperbolic tangent function, which is a sigmoid function. The approximations are pointwise and uniform. The related feed-forward neural networks have one or more hidden layers. The basis of our theory is the introduced multivariate Taylor formulae of trigonometric and hyperbolic type.
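For readers who want to experiment, a minimal numerical sketch of the activation and its induced (one-dimensional) density follows. The formulas g_{q,λ}(x) = (e^{λx} - q e^{-λx}) / (e^{λx} + q e^{-λx}) and M_{q,λ}(x) = (1/4)(g_{q,λ}(x+1) - g_{q,λ}(x-1)) are assumptions based on the forms used in the author's related works, not quoted from this paper.

```python
import math

def g(x, q=1.0, lam=1.0):
    """q-deformed, lambda-parametrized hyperbolic tangent
    (assumed form; reduces to the classical tanh for q = lam = 1)."""
    ep, em = math.exp(lam * x), math.exp(-lam * x)
    return (ep - q * em) / (ep + q * em)

def density(x, q=1.0, lam=1.0):
    """Density induced by g, assumed as (1/4)(g(x+1) - g(x-1)).
    Since g is increasing, this is nonnegative, and a telescoping
    argument shows it integrates to 1 over the real line."""
    return 0.25 * (g(x + 1.0, q, lam) - g(x - 1.0, q, lam))
```

At q = λ = 1 the activation reduces to tanh, and the density integrates to 1 for any q, λ > 0, which is what makes normalized operators built from it well defined.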

Cite This Article

George A. Anastassiou. q-Deformed and λ-parametrized hyperbolic tangent function relied complex valued multivariate trigonometric and hyperbolic neural network approximations. Annals of Communications in Mathematics, 2023, 6(3): 141-164. https://doi.org/10.62072/acm.2023.060301
References

[1] G.A. Anastassiou, Rate of convergence of some neural network operators to the unit-univariate case, J. Math. Anal. Appl. 212 (1997), 237-262.
[2] G.A. Anastassiou, Quantitative Approximations, Chapman&Hall/CRC, Boca Raton, New York, 2001.
[3] G.A. Anastassiou, Intelligent Systems: Approximation by Artificial Neural Networks, Intelligent Systems Reference Library, Vol. 19, Springer, Heidelberg, 2011.
[4] G.A. Anastassiou, Univariate hyperbolic tangent neural network approximation, Mathematical and Computer Modelling, 53 (2011), 1111-1132.
[5] G.A. Anastassiou, Multivariate hyperbolic tangent neural network approximation, Computers and Mathematics with Applications, 61 (2011), 809-821.
[6] G.A. Anastassiou, Multivariate sigmoidal neural network approximation, Neural Networks 24(2011), 378-386.
[7] G.A. Anastassiou, Univariate sigmoidal neural network approximation, J. of Computational Analysis and Applications, Vol. 14, No. 4, 2012, 659-690.
[8] G.A. Anastassiou, Approximation by neural networks iterates, Advances in Applied Mathematics and Approximation Theory, pp. 1-20, Springer Proceedings in Math. & Stat., Springer, New York, 2013, Eds. G. Anastassiou, O. Duman.
[9] G. Anastassiou, Intelligent Systems II: Complete Approximation by Neural Network Operators, Springer, Heidelberg, New York, 2016.
[10] G.A. Anastassiou, Intelligent Computations: Abstract Fractional Calculus, Inequalities, Approximations, Springer, Heidelberg, New York, 2018.
[11] G.A. Anastassiou, General sigmoid based Banach space valued neural network approximation, J. of Computational Analysis and Applications, 31(4) (2023), 520-534.
[12] G.A. Anastassiou, Parametrized, deformed and general neural networks, accepted for publication, Springer, Heidelberg, New York, 2023.
[13] G.A. Anastassiou, Opial and Ostrowski type inequalities based on trigonometric and hyperbolic type Taylor formulae, submitted, 2023.
[14] Z. Chen and F. Cao, The approximation operators with sigmoidal functions, Computers and Mathematics with Applications, 58 (2009), 758-765.
[15] D. Costarelli, R. Spigler, Approximation results for neural network operators activated by sigmoidal functions, Neural Networks 44 (2013), 101-106.
[16] D. Costarelli, R. Spigler, Multivariate neural network operators with sigmoidal activation functions, Neural Networks 48 (2013), 72-77.
[17] S. Haykin, Neural Networks: A Comprehensive Foundation (2 ed.), Prentice Hall, New York, 1998.
[18] W. McCulloch and W. Pitts, A logical calculus of the ideas immanent in nervous activity, Bulletin of Mathematical Biophysics, 7 (1943), 115-133.
[19] T.M. Mitchell, Machine Learning, WCB-McGraw-Hill, New York, 1997.

Copyright (c) 2023 by the Author(s). Licensee Techno Sky Publications. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
