TY - GEN
T1 - Online Discrepancy with Recourse for Vectors and Graphs
AU - Gupta, Anupam
AU - Gurunathan, Vijaykrishna
AU - Krishnaswamy, Ravishankar
AU - Kumar, Amit
AU - Singla, Sahil
N1 - Publisher Copyright:
Copyright © 2022 by SIAM. Unauthorized reproduction of this article is prohibited.
PY - 2022
Y1 - 2022
N2 - The vector-balancing problem is a fundamental problem in discrepancy theory: given T vectors in [-1, 1]^n, find a signing ε(a) ∈ {±1} of each vector a to minimize the discrepancy ‖∑_a ε(a)·a‖_∞. This problem has been extensively studied in the static/offline setting. In this paper we initiate its study in the fully-dynamic setting with recourse: the algorithm sees a stream of T insertions and deletions of vectors, and at each time must maintain a low-discrepancy signing, while also minimizing the amortized recourse (the number of times any vector changes its sign) per update. For general vectors, we show algorithms which almost match Spencer's O(√n) offline discrepancy bound, with O(n·polylog T) amortized recourse per update. The crucial idea behind our algorithm is to compute a basic feasible solution to the linear relaxation in a distributed and recursive manner, which helps find a low-discrepancy signing. We bound the recourse using the distributed computation of the basic solution, and argue that only a small part of the instance needs to be re-computed at each update. Since vector balancing has also been greatly studied for sparse vectors, we then give algorithms for low-discrepancy edge orientation, where we dynamically maintain signings for 2-sparse vectors in an n-dimensional space. Alternatively, this can be seen as orienting a dynamic set of edges of an n-vertex graph to minimize the discrepancy, i.e., the absolute difference between in- and out-degrees at any vertex. We present a deterministic algorithm with O(polylog n) discrepancy and O(polylog n) amortized recourse. The core ideas are to dynamically maintain an expander decomposition with low recourse (using a very simple approach), and then to show that, as the expanders change over time, a natural local-search algorithm converges quickly (i.e., with low recourse) to a low-discrepancy solution.
We also give strong lower bounds (with some matching upper bounds) for local-search discrepancy minimization algorithms for vector balancing and edge orientation.
AB - The vector-balancing problem is a fundamental problem in discrepancy theory: given T vectors in [-1, 1]^n, find a signing ε(a) ∈ {±1} of each vector a to minimize the discrepancy ‖∑_a ε(a)·a‖_∞. This problem has been extensively studied in the static/offline setting. In this paper we initiate its study in the fully-dynamic setting with recourse: the algorithm sees a stream of T insertions and deletions of vectors, and at each time must maintain a low-discrepancy signing, while also minimizing the amortized recourse (the number of times any vector changes its sign) per update. For general vectors, we show algorithms which almost match Spencer's O(√n) offline discrepancy bound, with O(n·polylog T) amortized recourse per update. The crucial idea behind our algorithm is to compute a basic feasible solution to the linear relaxation in a distributed and recursive manner, which helps find a low-discrepancy signing. We bound the recourse using the distributed computation of the basic solution, and argue that only a small part of the instance needs to be re-computed at each update. Since vector balancing has also been greatly studied for sparse vectors, we then give algorithms for low-discrepancy edge orientation, where we dynamically maintain signings for 2-sparse vectors in an n-dimensional space. Alternatively, this can be seen as orienting a dynamic set of edges of an n-vertex graph to minimize the discrepancy, i.e., the absolute difference between in- and out-degrees at any vertex. We present a deterministic algorithm with O(polylog n) discrepancy and O(polylog n) amortized recourse. The core ideas are to dynamically maintain an expander decomposition with low recourse (using a very simple approach), and then to show that, as the expanders change over time, a natural local-search algorithm converges quickly (i.e., with low recourse) to a low-discrepancy solution.
We also give strong lower bounds (with some matching upper bounds) for local-search discrepancy minimization algorithms for vector balancing and edge orientation.
UR - http://www.scopus.com/inward/record.url?scp=85130743554&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85130743554&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85130743554
T3 - Proceedings of the Annual ACM-SIAM Symposium on Discrete Algorithms
SP - 1356
EP - 1383
BT - ACM-SIAM Symposium on Discrete Algorithms, SODA 2022
PB - Society for Industrial and Applied Mathematics (SIAM)
T2 - 33rd Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2022
Y2 - 9 January 2022 through 12 January 2022
ER -