TY - JOUR
T1 - Depth separation beyond radial functions
AU - Venturi, Luca
AU - Jelassi, Samy
AU - Ozuch, Tristan
AU - Bruna, Joan
N1 - Publisher Copyright:
© 2022 Luca Venturi, Samy Jelassi, Tristan Ozuch and Joan Bruna.
PY - 2022
Y1 - 2022
AB - High-dimensional depth separation results for neural networks show that certain functions can be efficiently approximated by two-hidden-layer networks but not by one-hidden-layer ones in high dimensions. Existing results of this type mainly focus on functions with an underlying radial or one-dimensional structure, which are usually not encountered in practice. The first contribution of this paper is to extend such results to a more general class of functions, namely functions with a piecewise oscillatory structure, by building on the proof strategy of Eldan and Shamir (2016). We complement these results by showing that, if the domain radius and the rate of oscillation of the objective function are constant, then approximation by one-hidden-layer networks holds at a poly(d) rate for any fixed error threshold. Together, these results show that one-hidden-layer networks fail to approximate high-energy functions whose Fourier representation is spread in the frequency domain, while they succeed at approximating functions having a sparse Fourier representation. However, the choice of the domain represents a source of gaps between these positive and negative approximation results. We conclude the paper by focusing on a compact approximation domain, namely the sphere S^{d-1} in dimension d, where we provide a characterization both of functions that are efficiently approximable by one-hidden-layer networks and of functions that provably are not, in terms of their Fourier expansion.
KW - Depth separation
KW - Neural networks
UR - http://www.scopus.com/inward/record.url?scp=85130388479&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85130388479&partnerID=8YFLogxK
M3 - Article
AN - SCOPUS:85130388479
SN - 1532-4435
VL - 23
JO - Journal of Machine Learning Research
JF - Journal of Machine Learning Research
ER -