Seminar
11/20/2019 11:00 am
CoRE 431

Online Vector Balancing and Geometric Discrepancy

Sahil Singla, Princeton/IAS

Organizer(s): Rutgers/DIMACS Theory of Computing Seminar

Abstract

We consider an online vector balancing question where T vectors, chosen from an arbitrary distribution over [-1,1]^n, arrive one-by-one and must immediately be given a {+,-} sign. The goal is to keep the discrepancy---the $\ell_{\infty}$-norm of every signed prefix-sum---as small as possible. A concrete example of this question is the online interval discrepancy problem, where T points are sampled one-by-one uniformly in the unit interval [0,1], and the goal is to immediately color them {+,-} so that every sub-interval always remains nearly balanced. Since random coloring incurs $\Omega(T^{1/2})$ discrepancy, while the offline bounds are $\Theta((n\log T)^{1/2})$ for vector balancing and $1$ for interval balancing, a natural question is whether one can (nearly) match the offline bounds in the online setting for these problems. One must exploit the stochasticity, since in the worst case the discrepancy is known to be $\Omega(T^{1/2})$ for any online algorithm.
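To make the setup concrete, here is a minimal Python simulation (not an algorithm from the talk) comparing random signing against a naive greedy rule that picks whichever sign yields the smaller $\ell_{\infty}$ prefix norm; the parameters T, n, and the greedy rule itself are illustrative assumptions.

```python
import random

def discrepancy_run(T=2000, n=8, greedy=True, seed=0):
    """Sign T random vectors from [-1,1]^n online; return the worst
    l_inf norm attained by any signed prefix-sum."""
    rng = random.Random(seed)
    prefix = [0.0] * n
    worst = 0.0
    for _ in range(T):
        v = [rng.uniform(-1.0, 1.0) for _ in range(n)]
        if greedy:
            # Pick the sign that minimizes the resulting l_inf norm.
            plus = max(abs(p + x) for p, x in zip(prefix, v))
            minus = max(abs(p - x) for p, x in zip(prefix, v))
            s = 1 if plus <= minus else -1
        else:
            s = rng.choice([1, -1])  # random coloring baseline
        prefix = [p + s * x for p, x in zip(prefix, v)]
        worst = max(worst, max(abs(p) for p in prefix))
    return worst

random_disc = discrepancy_run(greedy=False)
greedy_disc = discrepancy_run(greedy=True)
print("random:", random_disc, "greedy:", greedy_disc)
```

In a typical run the random coloring drifts like a random walk (discrepancy on the order of $T^{1/2}$), while even this naive greedy rule stays far smaller, illustrating the gap the talk is about.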

In a special case of online vector balancing, Bansal and Spencer recently showed an $O(\sqrt{n}\log T)$ bound when each coordinate is chosen independently. When there are dependencies among the coordinates, as in the interval discrepancy problem, the problem becomes much more challenging. In this talk, we will introduce a new framework that allows us to handle online vector balancing even when the input distribution has dependencies across coordinates. In particular, this lets us obtain a $\mathrm{poly}(n, \log T)$ bound for online vector balancing under arbitrary input distributions, and a $\mathrm{polylog}(T)$ bound for online interval discrepancy. Our framework is powerful enough to capture other well-studied geometric discrepancy problems; e.g., we obtain a $\mathrm{poly}(\log^d T)$ bound for the online d-dimensional Tusnády's problem. All our bounds are tight up to polynomial factors. A key new technical ingredient in our work is an anti-concentration inequality for sums of pairwise uncorrelated random variables, which might also be of independent interest.
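One standard way to see interval discrepancy as vector balancing with dependent coordinates is to track, for each arriving point, the dyadic intervals containing it; the coordinate for interval $I$ then records the signed count of points in $I$. The sketch below simulates this reduction with a naive greedy signing rule; the depth, parameters, and greedy rule are our illustrative choices, not the algorithm from the talk.

```python
import random

def dyadic_coords(x, depth):
    """Indices of the dyadic intervals [k/2^l, (k+1)/2^l) containing x,
    one per level l = 0..depth, in a flat array of all such intervals."""
    idx, base = [], 0
    for level in range(depth + 1):
        k = int(x * (1 << level))  # which level-`level` interval holds x
        idx.append(base + k)
        base += 1 << level
    return idx

def online_interval_discrepancy(T=2000, depth=6, seed=1):
    dim = (1 << (depth + 1)) - 1   # total number of tracked dyadic intervals
    count = [0] * dim              # signed point count per interval
    worst = 0
    rng = random.Random(seed)
    for _ in range(T):
        x = rng.random()
        coords = dyadic_coords(x, depth)
        # Greedy: pick the sign keeping the touched intervals most balanced.
        plus = max(abs(count[i] + 1) for i in coords)
        minus = max(abs(count[i] - 1) for i in coords)
        s = 1 if plus <= minus else -1
        for i in coords:
            count[i] += s
        worst = max(worst, max(abs(count[i]) for i in coords))
    return worst

result = online_interval_discrepancy()
print("max dyadic-interval imbalance:", result)
```

Note how the coordinates of a single point's vector are strongly correlated (the dyadic intervals are nested), which is exactly the kind of dependency that makes the independent-coordinate analysis inapplicable.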

Based on two joint works: first with Haotian Jiang and Janardhan Kulkarni, and second with Nikhil Bansal, Haotian Jiang, and Makrand Sinha.