Efficient Approximation to Analytic and $L^p$ functions by Height-Augmented ReLU Networks

Authors: ZeYu Li, FengLei Fan, TieYong Zeng

Year: 2026

Subjects: stat.ML, cs.LG, cs.NE


Abstract

This work addresses two fundamental limitations in neural network approximation theory. We demonstrate that a three-dimensional network architecture enables a significantly more efficient representation of sawtooth functions, which serve as the cornerstone in the approximation of analytic and $L^p$ functions. First, we establish substantially improved exponential approximation rates for several important classes of analytic functions and offer a parameter-efficient network design. Second, for the first time, we derive quantitative, non-asymptotic, high-order approximation rates for general $L^p$ functions. Our techniques advance the theoretical understanding of neural network approximation in fundamental function spaces and offer a theoretically grounded pathway for designing more parameter-efficient networks.
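For background on why sawtooth functions are the cornerstone mentioned in the abstract: in the classical construction (not the three-dimensional architecture proposed in this paper), a depth-$k$ ReLU network realizes a sawtooth with $2^{k-1}$ teeth by composing a triangular "hat" function with itself, giving exponentially many oscillations per parameter. The sketch below illustrates that standard construction; the function names (`hat`, `sawtooth`) are illustrative choices, not from the paper.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Triangular "hat" function on [0, 1], realized by a single ReLU layer:
    # g(x) = 2*relu(x) - 4*relu(x - 0.5) + 2*relu(x - 1)
    # g maps [0, 0.5] linearly up to 1 and [0.5, 1] linearly back to 0.
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def sawtooth(x, k):
    # k-fold composition g∘g∘...∘g yields a sawtooth with 2^(k-1) teeth:
    # the number of oscillations grows exponentially in depth k.
    y = x
    for _ in range(k):
        y = hat(y)
    return y

x = np.linspace(0.0, 1.0, 9)
print(sawtooth(x, 3))  # alternates 0, 1, 0, 1, ... : four teeth at depth 3
```

Approximation results for analytic and $L^p$ functions are then built by combining such sawtooth networks into piecewise-polynomial approximants; the paper's contribution is a more parameter-efficient realization of this building block.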
