Cross Entropy versus Label Smoothing: A Neural Collapse Perspective

Li Guo, George Andriopoulos, Zifan Zhao, Zixuan Dong, Shuyang Ling, Keith Ross

Research output: Contribution to journal › Article › peer-review

Abstract

Label smoothing is a widely adopted technique to mitigate overfitting in deep neural networks (DNNs). This paper studies label smoothing from the perspective of Neural Collapse (NC), a powerful empirical and theoretical framework that characterizes model behavior during the terminal phase of training. We first show empirically that models trained with label smoothing converge faster to neural collapse solutions and attain a stronger level of neural collapse than those trained with cross-entropy loss. Furthermore, we show that at the same level of NC1, models trained with label smoothing exhibit intensified NC2. These findings provide valuable insights into the impact of label smoothing on model performance and calibration. Then, leveraging the unconstrained feature model, we derive closed-form solutions for the global minimizers under both label smoothing and cross-entropy losses. We show that models trained with label smoothing have a lower condition number and, therefore, theoretically converge faster. Our study, combining empirical evidence and theoretical results, not only provides nuanced insights into the differences between label smoothing and cross-entropy losses, but also serves as an example of how the powerful neural collapse framework can be used to improve our understanding of DNNs.
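To make the comparison in the abstract concrete, the sketch below contrasts standard cross-entropy with label-smoothed cross-entropy on raw logits. This is a minimal illustration, not the authors' code: the smoothing parameter `alpha`, the helper names, and the toy data are illustrative assumptions.

```python
# Minimal sketch (assumed, not from the paper): standard cross-entropy vs.
# label-smoothed cross-entropy, where the one-hot target is replaced by
# (1 - alpha) * one_hot + alpha / K.
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class dimension.
    z = logits - logits.max(axis=-1, keepdims=True)
    p = np.exp(z)
    return p / p.sum(axis=-1, keepdims=True)

def cross_entropy(logits, labels):
    # Standard cross-entropy: -log p_y for the true class y, averaged over the batch.
    p = softmax(logits)
    return -np.log(p[np.arange(len(labels)), labels]).mean()

def label_smoothing_ce(logits, labels, alpha=0.1):
    # Label smoothing spreads alpha probability mass uniformly over the K classes.
    K = logits.shape[-1]
    log_p = np.log(softmax(logits))
    one_hot = np.eye(K)[labels]
    target = (1.0 - alpha) * one_hot + alpha / K
    return -(target * log_p).sum(axis=-1).mean()

# Tiny usage example with random logits for a 4-class problem.
rng = np.random.default_rng(0)
logits = rng.normal(size=(8, 4))
labels = rng.integers(0, 4, size=8)
print("CE:", cross_entropy(logits, labels))
print("LS-CE (alpha=0.1):", label_smoothing_ce(logits, labels))
```

With alpha = 0, the smoothed loss reduces to standard cross-entropy; larger alpha pulls the target distribution toward uniform, which is the mechanism whose effect on neural collapse the paper analyzes.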

Original language: English (US)
Journal: Transactions on Machine Learning Research
Volume: 2025-May
State: Published - 2025

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
