On the number of linear regions of deep neural networks

Guido Montúfar, Razvan Pascanu, Kyunghyun Cho, Yoshua Bengio

Research output: Contribution to journal › Conference article › Peer-review

Abstract

We study the complexity of functions computable by deep feedforward neural networks with piecewise linear activations in terms of the symmetries and the number of linear regions that they have. Deep networks are able to sequentially map portions of each layer's input space to the same output. In this way, deep models compute functions that react equally to complicated patterns of different inputs. The compositional structure of these functions enables them to re-use pieces of computation exponentially often in terms of the network's depth. This paper investigates the complexity of such compositional maps and contributes new theoretical results regarding the advantage of depth for neural networks with piecewise linear activation functions. In particular, our analysis is not specific to a single family of models, and as an example, we employ it for rectifier and maxout networks. We improve complexity bounds from pre-existing work and investigate the behavior of units in higher layers.
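As a minimal illustrative sketch (not from the paper itself), the number of linear regions of a ReLU network can be empirically lower-bounded by counting distinct activation patterns over a dense grid of inputs: each region on which the network is affine has a constant on/off pattern across its ReLU units. The architecture, grid resolution, and random seed below are arbitrary assumptions chosen for illustration.

    # Sketch: lower-bound the number of linear regions of a small ReLU
    # network (2 -> 8 -> 8) by counting distinct activation patterns on a
    # grid. Grid sampling can miss small regions, so this is a lower bound.
    import numpy as np

    rng = np.random.default_rng(0)  # illustrative seed

    # Random weights for two hidden ReLU layers.
    W1, b1 = rng.standard_normal((8, 2)), rng.standard_normal(8)
    W2, b2 = rng.standard_normal((8, 8)), rng.standard_normal(8)

    def activation_pattern(x):
        """Return the on/off pattern of all ReLU units at input x."""
        h1 = W1 @ x + b1
        h2 = W2 @ np.maximum(h1, 0) + b2
        return tuple((h1 > 0).astype(int)) + tuple((h2 > 0).astype(int))

    # Sample a dense grid over [-1, 1]^2 and count distinct patterns.
    grid = np.linspace(-1.0, 1.0, 400)
    patterns = {activation_pattern(np.array([x, y]))
                for x in grid for y in grid}
    print(f"Distinct activation patterns (>= this many linear regions): "
          f"{len(patterns)}")

Repeating the count for deeper versus wider networks with the same total number of units is one way to observe the depth advantage the paper analyzes theoretically.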

Original language: English (US)
Pages (from-to): 2924-2932
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
Volume: 4
Issue number: January
State: Published - 2014
Event: 28th Annual Conference on Neural Information Processing Systems 2014, NIPS 2014 - Montreal, Canada
Duration: Dec 8, 2014 - Dec 13, 2014

Keywords

  • Deep learning
  • Input space partition
  • Maxout
  • Neural network
  • Rectifier

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

