Generalized dual Sudakov minoration via dimension-reduction—a program
Volume 244 / 2019
Abstract
We propose a program for establishing a conjectural extension, to the class of (origin-symmetric) log-concave probability measures $\mu$, of the classical dual Sudakov minoration on the expectation of the supremum of a Gaussian process: \begin{equation}\label{eq:abstract} M\Bigl(Z_p(\mu) ,\; C \int \|x\|_K \,d\mu(x) \cdot K\Bigr) \leq \exp(C p) \qquad \forall p \geq 1 . \end{equation} Here $K$ is an origin-symmetric convex body, $Z_p(\mu)$ is the $L_p$-centroid body associated to $\mu$, $M(A,B)$ is the packing number of $B$ in $A$, and $C > 0$ is a universal constant. The program is to first establish a weak generalized dual Sudakov minoration, involving the dimension $n$ of the ambient space, which is then self-improved to a dimension-free estimate after applying a dimension-reduction step. The latter step may be thought of as a conjectural “small-ball one-sided” variant of the Johnson–Lindenstrauss dimension-reduction lemma. We establish the weak generalized dual Sudakov minoration for a variety of log-concave probability measures and convex bodies (for instance, this step is fully resolved assuming a positive answer to the slicing problem). The separation dimension-reduction step is fully established for ellipsoids and, up to logarithmic factors in the dimension, for cubes, resulting in corresponding generalized (regular) dual Sudakov minoration estimates for these bodies and arbitrary log-concave measures, which are shown to be (essentially) best possible. Along the way, we establish a regular version of \eqref{eq:abstract} for all $p \geq n$ and provide a new direct proof of Sudakov minoration via the program.
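For orientation, we recall the standard definition of the $L_p$-centroid body and sketch the Gaussian sanity check, using only standard facts (this expansion is not claimed in the abstract itself). The body $Z_p(\mu)$ is the convex body whose support function is
\[
h_{Z_p(\mu)}(\theta) = \Bigl( \int_{\mathbb{R}^n} |\langle x , \theta \rangle|^{p} \, d\mu(x) \Bigr)^{1/p} , \qquad \theta \in \mathbb{R}^n .
\]
When $\mu = \gamma_n$ is the standard Gaussian measure, one has $Z_p(\gamma_n) \simeq \sqrt{p}\, B_2^n$ for $p \geq 1$ (up to universal constants) and $\int \|x\|_K \, d\gamma_n(x) = \mathbb{E}\|G\|_K$ for a standard Gaussian vector $G$, so that \eqref{eq:abstract} reads
\[
M\bigl( \sqrt{p}\, B_2^n ,\; C\, \mathbb{E}\|G\|_K \cdot K \bigr) \leq \exp(C p) \qquad \forall p \geq 1 ,
\]
which, after rescaling, is (up to the values of the constants) the classical dual Sudakov minoration referred to above.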