Abstract
We show through case studies that it is easier to estimate the fundamental limits of data processing than to construct explicit algorithms that achieve those limits. Focusing on binary classification, data compression, and prediction under logarithmic loss, we show that in the finite space setting, when it is possible to construct an estimator of the limits with vanishing error from n samples, it may require at least n ln n samples to construct an explicit algorithm that achieves those limits.
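For intuition, here is a minimal back-of-the-envelope sketch of the stated gap for the data-compression case study over an alphabet of size S. It assumes two standard facts from the literature (the minimax sample complexity of entropy estimation and the per-symbol redundancy of universal compression) rather than the paper's own arguments; the symbols n_est and n_ach are illustrative names, not the paper's notation.

```latex
% Back-of-the-envelope sketch of the n vs. n ln n gap (assumed standard
% rates; n_est and n_ach are illustrative names, not the paper's notation).

% Estimating the limit: the entropy H(P) of an unknown distribution P on
% S symbols can be estimated with vanishing error once
\[
  n_{\mathrm{est}} \asymp \frac{S}{\ln S}.
\]

% Achieving the limit: compressing at rate H(P) + o(1) requires the
% per-symbol redundancy of universal coding, roughly (S / 2n) ln n, to
% vanish, which in particular forces
\[
  n_{\mathrm{ach}} \gg S.
\]

% Combining the two displays: since ln n_est is of order ln S,
\[
  n_{\mathrm{ach}} \;\gg\; S \;\asymp\; \frac{S}{\ln S} \cdot \ln S
  \;\asymp\; n_{\mathrm{est}} \ln n_{\mathrm{est}},
\]
% which is the n vs. n ln n separation described in the abstract.
```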
| Original language | English (US) |
|---|---|
| Article number | 8758354 |
| Pages (from-to) | 6704-6715 |
| Number of pages | 12 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 65 |
| Issue number | 10 |
| DOIs | |
| State | Published - Oct 2019 |
Keywords
- Bayes envelope estimation
- entropy estimation
- generalized entropy
- prediction under logarithmic loss
- total variation distance estimation
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Library and Information Sciences