TY - JOUR
T1 - Bias Correction with Jackknife, Bootstrap, and Taylor Series
AU - Jiao, Jiantao
AU - Han, Yanjun
N1 - Funding Information:
Manuscript received July 17, 2019; accepted January 4, 2020. Date of publication January 27, 2020; date of current version June 18, 2020. The work of Jiantao Jiao was supported in part by the National Science Foundation (NSF) under Grants IIS-1901252 and CCF-1909499. Jiantao Jiao is with the Department of Electrical Engineering and Computer Sciences and the Department of Statistics, University of California, Berkeley, CA 94720 USA (e-mail: jiantao@eecs.berkeley.edu). Yanjun Han is with the Department of Electrical Engineering, Stanford University, Stanford, CA 94305 USA (e-mail: yjhan@stanford.edu). Communicated by N. Santhanam, Associate Editor for Source Coding. Color versions of one or more of the figures in this article are available online at http://ieeexplore.ieee.org. Digital Object Identifier 10.1109/TIT.2020.2969439
Publisher Copyright:
© 1963-2012 IEEE.
PY - 2020/7
Y1 - 2020/7
N2 - We analyze bias correction methods using jackknife, bootstrap, and Taylor series. We focus on the binomial model and consider the problem of bias correction for estimating f(p), where f \in C[0,1] is arbitrary. We characterize the supremum norm of the bias of general jackknife and bootstrap estimators for any continuous function, and demonstrate that in the delete-d jackknife, different values of d may lead to drastically different behaviors. We show that in the binomial model, iterating the bootstrap bias correction infinitely many times may lead to divergence of bias and variance, and demonstrate that the bias properties of the bootstrap bias-corrected estimator after r-1 rounds are of the same order as those of the r-jackknife estimator if a bounded coefficients condition is satisfied.
AB - We analyze bias correction methods using jackknife, bootstrap, and Taylor series. We focus on the binomial model and consider the problem of bias correction for estimating f(p), where f \in C[0,1] is arbitrary. We characterize the supremum norm of the bias of general jackknife and bootstrap estimators for any continuous function, and demonstrate that in the delete-d jackknife, different values of d may lead to drastically different behaviors. We show that in the binomial model, iterating the bootstrap bias correction infinitely many times may lead to divergence of bias and variance, and demonstrate that the bias properties of the bootstrap bias-corrected estimator after r-1 rounds are of the same order as those of the r-jackknife estimator if a bounded coefficients condition is satisfied.
KW - Bootstrap
KW - approximation theory
KW - bias correction
KW - functional estimation
KW - jackknife
UR - http://www.scopus.com/inward/record.url?scp=85087176055&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85087176055&partnerID=8YFLogxK
U2 - 10.1109/TIT.2020.2969439
DO - 10.1109/TIT.2020.2969439
M3 - Article
AN - SCOPUS:85087176055
SN - 0018-9448
VL - 66
SP - 4392
EP - 4418
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 7
M1 - 8970278
ER -