Neural networks and rational functions

Matus Telgarsky

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Neural networks and rational functions efficiently approximate each other. In more detail, it is shown here that for any ReLU network, there exists a rational function of degree O(poly log(1/ε)) which is ε-close, and similarly for any rational function there exists a ReLU network of size O(poly log(1/ε)) which is ε-close. By contrast, polynomials need degree Ω(poly(1/ε)) to approximate even a single ReLU. When converting a ReLU network to a rational function as above, the hidden constants depend exponentially on the number of layers, which is shown to be tight; in other words, a compositional representation can be beneficial even for rational functions.
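
The polynomial-versus-rational gap for a single ReLU can be checked numerically. The sketch below is illustrative and not the paper's code; it uses Newman's classical rational approximant of |x| (a construction of the kind such results build on), whose uniform error on [-1, 1] decays roughly like exp(-√n) at degree n, and transfers it to ReLU via ReLU(x) = (x + |x|)/2. A degree-n polynomial baseline (here a least-squares Chebyshev fit, an assumed stand-in for the minimax polynomial) improves only polynomially in n.

```python
# Minimal numerical sketch (illustrative; not from the paper):
# rational vs. polynomial approximation of ReLU on [-1, 1].
import numpy as np

def newman_relu(x, n):
    """Degree-n rational approximation of ReLU on [-1, 1].

    Newman (1964): with xi = exp(-1/sqrt(n)) and p(t) = prod_k (t + xi^k),
    r(t) = t * (p(t) - p(-t)) / (p(t) + p(-t)) approximates |t| with
    uniform error at most 3 exp(-sqrt(n)). Since r is even, we evaluate
    at |x| through the ratio q = p(-t)/p(t), which avoids overflow, and
    recover ReLU(x) = (x + |x|) / 2.
    """
    nodes = np.exp(-1.0 / np.sqrt(n)) ** np.arange(n)
    t = np.abs(x)
    q = np.prod((nodes - t[:, None]) / (nodes + t[:, None]), axis=1)
    abs_approx = t * (1.0 - q) / (1.0 + q)
    return 0.5 * (x + abs_approx)

xs = np.linspace(-1.0, 1.0, 8001)
relu = np.maximum(xs, 0.0)
for n in (16, 64, 256):
    rat_err = np.max(np.abs(relu - newman_relu(xs, n)))
    # Degree-n polynomial baseline; its sup error shrinks only
    # polynomially in n, matching the Omega(poly(1/eps)) lower bound.
    c = np.polynomial.chebyshev.chebfit(xs, relu, n)
    poly_err = np.max(np.abs(relu - np.polynomial.chebyshev.chebval(xs, c)))
    print(f"degree {n:3d}: rational err {rat_err:.1e}, polynomial err {poly_err:.1e}")
```

As the degree grows, the rational error drops super-polynomially while the polynomial error shrinks only polynomially, matching the contrast stated in the abstract.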

Original language: English (US)
Title of host publication: 34th International Conference on Machine Learning, ICML 2017
Publisher: International Machine Learning Society (IMLS)
Pages: 5195-5210
Number of pages: 16
ISBN (Electronic): 9781510855144
State: Published - 2017
Event: 34th International Conference on Machine Learning, ICML 2017 - Sydney, Australia
Duration: Aug 6 2017 → Aug 11 2017

Publication series

Name: 34th International Conference on Machine Learning, ICML 2017
Volume: 7

Other

Other: 34th International Conference on Machine Learning, ICML 2017
Country/Territory: Australia
City: Sydney
Period: 8/6/17 → 8/11/17

ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Human-Computer Interaction
  • Software
