### Abstract

We consider certain simple neural network models of associative memory with N binary neurons and symmetric lth order synaptic connections, in which m randomly chosen N-bit patterns are to be stored and retrieved with a small fraction δ of bit errors allowed. Rigorous proofs of the following are presented both for l = 2 and l ≥ 3: (1) m can grow as fast as αN^{l-1}; (2) α can be as large as B_{l}/ln(1/δ) as δ → 0; (3) retrieved memories overlapping with several initial patterns persist for (very) small α. These phenomena were previously supported by numerical simulations or nonrigorous calculations. The quantity (l!) · α represents the number of stored bits per distinct synapse. The constant (l!) · B_{l} is determined explicitly; it decreases monotonically with l and tends to zero exponentially fast as l → ∞. We obtain rigorous lower bounds for the threshold value (l!) · α_{c} (the maximum possible value of (l!) · α with δ unconstrained): 0.11 for l = 2 (compared to the actual value between 0.28 and 0.30 as estimated by Hopfield and by Amit, Gutfreund, and Sompolinsky), 0.22 for l = 3, and 0.16 for l = 4; as l → ∞, the bound tends to zero as fast as (l!) · B_{l}.
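The model described above can be illustrated for the second-order (l = 2, Hopfield) case. The following is a minimal sketch, not the paper's construction: it assumes Hebbian outer-product storage of ±1 patterns and synchronous sign-threshold retrieval dynamics, with illustrative sizes N = 200 and m = 10 chosen well below capacity so that a corrupted pattern is recovered with a small fraction of bit errors.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200  # binary neurons
m = 10   # stored patterns (illustrative; far below the l = 2 capacity)

# m randomly chosen N-bit patterns with entries ±1
patterns = rng.choice([-1, 1], size=(m, N))

# Symmetric second-order (l = 2) Hebbian synapses:
# J_ij = (1/N) * sum_mu xi^mu_i xi^mu_j, no self-coupling
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0)

def retrieve(state, steps=20):
    """Iterate s_i <- sign(sum_j J_ij s_j) until a fixed point."""
    for _ in range(steps):
        new = np.sign(J @ state)
        new[new == 0] = 1  # break ties deterministically
        if np.array_equal(new, state):
            break
        state = new
    return state

# Corrupt a stored pattern in 10% of its bits, then retrieve
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1
recovered = retrieve(probe)
errors = np.mean(recovered != patterns[0])  # fraction of bit errors δ
```

At this loading (α = m/N = 0.05) the dynamics typically converge to the stored pattern, so the measured error fraction stays near zero; pushing m toward the threshold discussed in the abstract degrades retrieval.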

| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 223-238 |
| Number of pages | 16 |
| Journal | Neural Networks |
| Volume | 1 |
| Issue number | 3 |
| DOIs | |
| State | Published - 1988 |

### ASJC Scopus subject areas

- Cognitive Neuroscience
- Artificial Intelligence