Memory capacity in neural network models: Rigorous lower bounds

Charles M. Newman

Research output: Contribution to journal › Article › peer-review

Abstract

We consider certain simple neural network models of associative memory with N binary neurons and symmetric l-th order synaptic connections, in which m randomly chosen N-bit patterns are to be stored and retrieved with a small fraction δ of bit errors allowed. Rigorous proofs of the following are presented both for l = 2 and l ≥ 3:

  1. m can grow as fast as αN^(l−1).
  2. α can be as large as B_l/ln(1/δ) as δ → 0.
  3. Retrieved memories overlapping with several initial patterns persist for (very) small α.

These phenomena were previously supported by numerical simulations or nonrigorous calculations. The quantity (l!)·α represents the number of stored bits per distinct synapse. The constant (l!)·B_l is determined explicitly; it decreases monotonically with l and tends to zero exponentially fast as l → ∞. We obtain rigorous lower bounds for the threshold value (l!)·α_c (the maximum possible value of (l!)·α with δ unconstrained): 0.11 for l = 2 (compared with the actual value between 0.28 and 0.30 as estimated by Hopfield and by Amit, Gutfreund, and Sompolinsky), 0.22 for l = 3, and 0.16 for l = 4; as l → ∞, the bound tends to zero as fast as (l!)·B_l.
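The l = 2 case described in the abstract is the standard Hopfield model: Hebbian (outer-product) symmetric synapses and sign-threshold dynamics, with m = αN stored patterns. The sketch below is an illustration of that model class, not code from the paper; the network size N, the loading ratio α, and the synchronous update schedule are illustrative assumptions chosen here to make the bit-error fraction δ easy to estimate empirically.

```python
# Minimal sketch of the l = 2 associative-memory model the abstract analyzes:
# N binary (+/-1) neurons, Hebbian outer-product symmetric synapses, and
# m = alpha*N random stored patterns. We estimate the bit-error fraction
# delta when each stored pattern is used as its own retrieval cue.
# N and alpha are illustrative choices, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

N = 500                 # number of binary neurons (illustrative)
alpha = 0.05            # loading ratio m/N, below the rigorous l = 2 bound 0.11/2!
m = int(alpha * N)      # number of stored N-bit patterns

# m randomly chosen N-bit patterns with entries +/-1
patterns = rng.choice([-1, 1], size=(m, N))

# Second-order (l = 2) Hebbian synapses, symmetric with zero self-coupling
J = (patterns.T @ patterns) / N
np.fill_diagonal(J, 0.0)

def retrieve(state, steps=20):
    """Synchronous updates s <- sign(J s) until a fixed point (or step cap)."""
    for _ in range(steps):
        new = np.sign(J @ state)
        new[new == 0] = 1          # break ties deterministically
        if np.array_equal(new, state):
            break
        state = new
    return state

# Mean fraction of bit errors over all stored patterns
delta = np.mean([np.mean(retrieve(p.copy()) != p) for p in patterns])
print(f"N={N}, m={m} (alpha={alpha}): mean bit-error fraction delta = {delta:.4f}")
```

For loading ratios below the rigorous threshold the measured δ should be very small; pushing α toward and past the estimated α_c ≈ 0.14-0.15 makes retrieval errors proliferate, which is the transition the paper's lower bounds address.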

Original language: English (US)
Pages (from-to): 223-238
Number of pages: 16
Journal: Neural Networks
Volume: 1
Issue number: 3
DOIs
State: Published - 1988

ASJC Scopus subject areas

  • Cognitive Neuroscience
  • Artificial Intelligence
