Eigenvalues of covariance matrices: Application to neural-network learning

Yann Le Cun, Ido Kanter, Sara A. Solla

Research output: Contribution to journal › Article

Abstract

The learning time of a simple neural-network model is obtained through an analytic computation of the eigenvalue spectrum for the Hessian matrix, which describes the second-order properties of the objective function in the space of coupling coefficients. The results are generic for symmetric matrices obtained by summing outer products of random vectors. The form of the eigenvalue distribution suggests new techniques for accelerating the learning process, and provides a theoretical justification for the choice of centered versus biased state variables.
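The effect described in the abstract can be seen numerically. The following NumPy sketch (not from the paper; sizes and variable names are illustrative) forms the covariance-like matrix obtained by summing outer products of random pattern vectors, and compares the eigenvalue spread for biased states in {0, 1} against centered states in {-1, +1}. Biased states produce one large outlier eigenvalue along the mean direction, which worsens the conditioning that governs gradient-descent learning time.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 50, 2000  # input dimension and number of random patterns (illustrative)

def covariance_spectrum(states):
    """Eigenvalues (ascending) of (1/P) * sum_p x_p x_p^T for rows x_p."""
    H = states.T @ states / states.shape[0]
    return np.linalg.eigvalsh(H)

bits = rng.integers(0, 2, size=(P, N))   # random binary patterns
biased = bits.astype(float)              # state variables in {0, 1}
centered = 2.0 * bits - 1.0              # state variables in {-1, +1}

for name, X in [("biased {0,1}", biased), ("centered {-1,+1}", centered)]:
    ev = covariance_spectrum(X)
    print(f"{name}: eigenvalue ratio lambda_max/lambda_min = {ev[-1] / ev[0]:.1f}")
```

With these sizes the centered states give eigenvalues clustered near 1, while the biased states add a single large eigenvalue of order N/4, so the eigenvalue ratio (and hence the spread of convergence rates) is far larger in the biased case.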

Original language: English (US)
Pages (from-to): 2396-2399
Number of pages: 4
Journal: Physical Review Letters
Volume: 66
Issue number: 18
State: Published - 1991

ASJC Scopus subject areas

  • Physics and Astronomy (all)

