Generalized Stechkin-Marchaud Inequalities via Deep Neural Network Approximation: Theorems on Convergence, Stability, and Error Bounds

Authors

  • Nagham Ali Hussen, University of Information Technology and Communications, College of Engineering, Mobile Computing and Communications Engineering, Baghdad, Iraq

Keywords:

Stechkin-Marchaud inequality, Deep neural networks, Approximation theory, Modulus of smoothness, K-functional

Abstract

This paper presents an extension of the classical Stechkin-Marchaud inequality by establishing generalized theorems for deep neural network (DNN) approximation in Lp-spaces, building on operator-theoretic methods and the modulus of smoothness. Three main results are established: (i) a convergence theorem characterizing the uniform approximation power of neural operators, (ii) a stability theorem guaranteeing robustness under perturbations, and (iii) an explicit error-bound theorem linking network complexity to approximation quality. The proposed approach exploits the equivalence between K-functionals and moduli of smoothness, yielding a rigorous foundation for neural network approximators in these spaces. The results support practical applications in data-driven scientific computing and signal processing. Numerical experiments validate the theoretical bounds and highlight performance gains over classical operators. This work provides a basis for bridging classical approximation theory and modern deep learning models.
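For orientation, the two classical tools the abstract invokes take the following standard forms in the setting 1 ≤ p ≤ ∞ (a reminder of the textbook statements, not the generalized versions proved in the paper; here E_n(f)_p denotes the error of best approximation, ω_r the r-th modulus of smoothness, and K_r the Peetre K-functional):

```latex
% Classical Stechkin-type inequality: the r-th modulus of smoothness
% is controlled by a weighted sum of best-approximation errors.
\omega_r\!\left(f, \tfrac{1}{n}\right)_p
  \;\le\; \frac{C}{n^{r}} \sum_{k=0}^{n} (k+1)^{r-1}\, E_k(f)_p .

% Equivalence between the modulus of smoothness and the Peetre
% K-functional, where
%   K_r(f, t^r)_p = \inf_{g} \bigl\{ \|f - g\|_p + t^r \|g^{(r)}\|_p \bigr\}:
C_1\, K_r(f, t^r)_p \;\le\; \omega_r(f, t)_p \;\le\; C_2\, K_r(f, t^r)_p .
```

In the quasi-normed range 0 < p < 1 the classical K-functional degenerates, which is one reason generalized Stechkin-Marchaud-type inequalities of the kind studied here are needed.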

References

D. Elbrächter, D. Perekrestenko, P. Grohs, and H. Bölcskei, “Deep neural network approximation theory,” IEEE Transactions on Information Theory, vol. 67, no. 5, pp. 2581–2623, 2021.

I. D. Mienye and T. G. Swart, “A comprehensive review of deep learning: Architectures, recent advances, and applications,” Information, vol. 15, no. 12, p. 755, 2024.

S. Guo, H. Tong, and G. Zhang, “Stechkin–Marchaud-type inequalities for Baskakov polynomials,” Journal of Approximation Theory, vol. 114, no. 1, pp. 33–47, 2002.

J. Wang, Y. Xue, and F. Li, “Stechkin-Marchaud-type inequalities with Jacobi weights for Bernstein operators,” Journal of Applied Mathematics & Informatics, vol. 24, nos. 1–2, pp. 343–355, 2007.

E. S. Bhaya and Z. H. Abd Al-sadaa, “Stechkin-Marchaud inequality in terms of neural networks approximation in Lp-space for 0 < p < 1,” in IOP Conference Series: Materials Science and Engineering, vol. 571, no. 1, p. 012020, Jul. 2019, IOP Publishing.

M. Mastyło, “The modulus of smoothness in metric spaces and related problems,” Potential Analysis, vol. 35, no. 4, pp. 301–328, 2011.

R. DeVore, B. Hanin, and G. Petrova, “Neural network approximation,” Acta Numerica, vol. 30, pp. 327–444, 2021.

G. Feng, “Weighted Stechkin-Marchaud-type inequalities for Baskakov-Durmeyer operators,” in 2011 International Conference on Multimedia Technology, pp. 6168–6170, 2011.

M. A. Mursaleen, Positive Linear Operators and Approximation Properties, Ph.D. dissertation, Univ. Newcastle, 2022.

M. Unser, “A note on BIBO stability,” IEEE Transactions on Signal Processing, vol. 68, pp. 5904–5913, 2020.

T. Kato, Perturbation Theory for Linear Operators, vol. 132. Berlin, Germany: Springer, 2013.

G. Hjorth, “A boundedness lemma for iterations,” The Journal of Symbolic Logic, vol. 66, no. 3, pp. 1058–1072, 2001.

A. Damle and Y. Sun, “Uniform bounds for invariant subspace perturbations,” SIAM Journal on Matrix Analysis and Applications, vol. 41, no. 3, pp. 1208–1236, 2020.

W. Xiao, S. Li, and H. Liu, “Generalized partially functional linear model with unknown link function,” Axioms, vol. 12, no. 12, p. 1089, 2023.

K. Hornik, M. Stinchcombe, and H. White, “Multilayer feedforward networks are universal approximators,” Neural Networks, vol. 2, no. 5, pp. 359–366, 1989.

Published

2026-03-13

How to Cite

Hussen, N. A. (2026). Generalized Stechkin-Marchaud Inequalities via Deep Neural Network Approximation: Theorems on Convergence, Stability, and Error Bounds. CENTRAL ASIAN JOURNAL OF MATHEMATICAL THEORY AND COMPUTER SCIENCES, 7(2), 183–189. Retrieved from https://cajmtcs.casjournal.org/index.php/CAJMTCS/article/view/900
