Soft Matching Distance: A metric on neural representations that captures single-neuron tuning

Meenakshi Khosla, Alex H. Williams

Research output: Contribution to journal › Conference article › peer-review

Abstract

Common measures of neural representational (dis)similarity are designed to be insensitive to rotations and reflections of the neural activation space. Motivated by the premise that the tuning of individual units may be important, recent work has sought stricter notions of representational (dis)similarity that require neurons to be individually matched across networks. When two networks have the same size (i.e., the same number of neurons), a distance metric can be formulated by optimizing over neuron index permutations to maximize tuning curve alignment. However, it is not clear how to generalize this metric to measure distances between networks with different sizes. Here, we leverage a connection to optimal transport theory to derive a natural generalization based on “soft” permutations. The resulting metric is symmetric, satisfies the triangle inequality, and can be interpreted as a Wasserstein distance between two empirical distributions. Further, our proposed metric avoids counter-intuitive outcomes suffered by alternative approaches, and captures complementary geometric insights into neural representations that are entirely missed by rotation-invariant metrics.
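To make the construction concrete, here is a minimal sketch (not the authors' reference implementation) of the distance described in the abstract. Each network is represented by a response matrix whose columns are neurons' tuning curves over a shared set of stimuli, and the distance is computed as a 2-Wasserstein distance between the two empirical distributions of tuning curves by solving the optimal-transport linear program with SciPy. The function name, array shapes, and uniform marginal weights over neurons are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog


def soft_matching_distance(X, Y):
    """Sketch of a 2-Wasserstein distance between the neurons (columns) of X and Y.

    X : (m, p) array -- responses of p neurons to m shared stimuli
    Y : (m, q) array -- responses of q neurons to the same m stimuli
    """
    p, q = X.shape[1], Y.shape[1]

    # Squared Euclidean cost between every pair of tuning curves.
    C = np.sum((X[:, :, None] - Y[:, None, :]) ** 2, axis=0)  # shape (p, q)

    # Optimal-transport LP over plans P >= 0 (flattened row-major):
    #   minimize <C, P>  subject to  P 1 = (1/p) 1  and  P^T 1 = (1/q) 1.
    A_eq = np.zeros((p + q, p * q))
    for i in range(p):          # row marginals: each X-neuron ships mass 1/p
        A_eq[i, i * q:(i + 1) * q] = 1.0
    for j in range(q):          # column marginals: each Y-neuron receives mass 1/q
        A_eq[p + j, j::q] = 1.0
    b_eq = np.concatenate([np.full(p, 1.0 / p), np.full(q, 1.0 / q)])

    res = linprog(C.ravel(), A_eq=A_eq, b_eq=b_eq,
                  bounds=(0, None), method="highs")
    return float(np.sqrt(res.fun))


# Illustrative usage with random data (sizes are arbitrary).
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))   # network with 10 neurons
Y = rng.standard_normal((50, 14))   # network with 14 neurons
print(soft_matching_distance(X, Y))
```

Note that when p = q, the linear program attains its optimum at an extreme point of the (scaled) doubly stochastic polytope, which by Birkhoff's theorem is a permutation matrix divided by p, so the value coincides with the hard permutation-matching distance; in that special case scipy.optimize.linear_sum_assignment recovers the same answer more efficiently.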

Original language: English (US)
Pages (from-to): 133-143
Number of pages: 11
Journal: Proceedings of Machine Learning Research
Volume: 243
State: Published - 2023
Event: 1st Workshop on Unifying Representations in Neural Models, UniReps 2023 - New Orleans, United States
Duration: Dec 15 2023 → …

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
