Transformation invariance in pattern recognition: Tangent distance and propagation

Patrice Y. Simard, Yann A. Le Cun, John S. Denker, Bernard Victorri

Research output: Contribution to journal › Article › peer-review

Abstract

In pattern recognition, statistical modeling, or regression, the amount of data is a critical factor affecting performance. If data and computational resources are unlimited, even trivial algorithms will converge to the optimal solution. In practice, however, with limited data and other resources, satisfactory performance requires sophisticated methods that regularize the problem by introducing a priori knowledge. Invariance of the output with respect to certain transformations of the input is a typical example of such a priori knowledge. We introduce the concept of tangent vectors, which compactly represent the essence of these transformation invariances, and two classes of algorithms, tangent distance and tangent propagation, which make use of these invariances to improve performance.
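The core idea can be illustrated with a minimal sketch of a one-sided tangent distance (the paper also develops a two-sided variant): given tangent vectors spanning the linearized transformation manifold at a point `x`, the distance from `x` to `y` is the smallest Euclidean distance from `y` to that tangent plane. The function names and the toy tangent vectors below are illustrative, not taken from the paper.

```python
import numpy as np

def tangent_distance(x, y, T):
    """One-sided tangent distance from pattern x to pattern y.

    T is a (d, k) matrix whose columns are tangent vectors at x,
    i.e. linearized directions of the invariant transformations.
    We minimize ||(x + T a) - y|| over the coefficients a, which is
    an ordinary least-squares problem.
    """
    a, *_ = np.linalg.lstsq(T, y - x, rcond=None)
    return float(np.linalg.norm(x + T @ a - y))

# Toy example: a single tangent vector pointing along (1, 1, 1).
x = np.zeros(3)
y = np.ones(3)
T = np.ones((3, 1))
# y lies exactly on the tangent plane through x, so the distance is 0,
# even though the Euclidean distance ||y - x|| is sqrt(3).
print(tangent_distance(x, y, T))
```

Because the tangent plane always contains `x` itself (with `a = 0`), the tangent distance is never larger than the plain Euclidean distance; it simply discounts displacements along the modeled transformations.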

Original language: English (US)
Pages (from-to): 181-197
Number of pages: 17
Journal: International Journal of Imaging Systems and Technology
Volume: 11
Issue number: 3
DOIs
State: Published - 2000

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Software
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering

