TY - JOUR
T1 - Minimax optimal testing via classification
AU - Gerber, Patrik Róbert
AU - Han, Yanjun
AU - Polyanskiy, Yury
N1 - Funding Information:
YH was generously supported by the Norbert Wiener postdoctoral fellowship in statistics at MIT IDSS. YP was supported in part by the National Science Foundation under Grant No. CCF-2131115. Research was sponsored by the United States Air Force Research Laboratory and the Department of the Air Force Artificial Intelligence Accelerator and was accomplished under Cooperative Agreement Number FA8750-19-2-1000. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the Department of the Air Force or the U.S. Government. The U.S. Government is authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation herein.
Publisher Copyright:
© 2023 P.R. Gerber, Y. Han & Y. Polyanskiy.
PY - 2023
Y1 - 2023
N2 - This paper considers an ML-inspired approach to hypothesis testing known as classifier/classification-accuracy testing (CAT). In CAT, one first trains a classifier on labeled synthetic samples generated from the null and alternative distributions, and then uses it to predict labels of the actual data samples. This method is widely used in practice when the null and alternative are specified only via simulators (as in many scientific experiments). We study goodness-of-fit, two-sample (TS), and likelihood-free hypothesis testing (LFHT), and show that CAT achieves (near-)minimax optimal sample complexity in its dependence on both the total-variation (TV) separation ϵ and the probability of error δ in a variety of non-parametric settings, including discrete distributions, d-dimensional distributions with a smooth density, and the Gaussian sequence model. In particular, we settle the high-probability sample complexity of LFHT for each class. As another highlight, we recover the minimax optimal complexity of TS over discrete distributions, which was recently established by Diakonikolas et al. (2021). The corresponding CAT simply compares empirical frequencies in the first half of the data, and rejects the null when the classification accuracy on the second half is better than random.
AB - This paper considers an ML-inspired approach to hypothesis testing known as classifier/classification-accuracy testing (CAT). In CAT, one first trains a classifier on labeled synthetic samples generated from the null and alternative distributions, and then uses it to predict labels of the actual data samples. This method is widely used in practice when the null and alternative are specified only via simulators (as in many scientific experiments). We study goodness-of-fit, two-sample (TS), and likelihood-free hypothesis testing (LFHT), and show that CAT achieves (near-)minimax optimal sample complexity in its dependence on both the total-variation (TV) separation ϵ and the probability of error δ in a variety of non-parametric settings, including discrete distributions, d-dimensional distributions with a smooth density, and the Gaussian sequence model. In particular, we settle the high-probability sample complexity of LFHT for each class. As another highlight, we recover the minimax optimal complexity of TS over discrete distributions, which was recently established by Diakonikolas et al. (2021). The corresponding CAT simply compares empirical frequencies in the first half of the data, and rejects the null when the classification accuracy on the second half is better than random.
KW - Classifier-accuracy testing
KW - Closeness testing
KW - Goodness-of-fit testing
KW - Identity testing
KW - Likelihood-free hypothesis testing
KW - Likelihood-free inference
KW - Scheffé’s test
KW - Two-sample testing
UR - http://www.scopus.com/inward/record.url?scp=85171587425&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85171587425&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85171587425
SN - 2640-3498
VL - 195
SP - 5395
EP - 5432
JO - Proceedings of Machine Learning Research
JF - Proceedings of Machine Learning Research
T2 - 36th Annual Conference on Learning Theory, COLT 2023
Y2 - 12 July 2023 through 15 July 2023
ER -