Random sections of ellipsoids and the power of random information

Bibliographic Details
Published in: Transactions of the American Mathematical Society 2021-12, Vol. 374 (12), p. 8691-8713
Main Authors: Hinrichs, Aicke; Krieg, David; Novak, Erich; Prochno, Joscha; Ullrich, Mario
Format: Article
Language: English
Online Access: Full text
Description
Summary: We study the circumradius of the intersection of an m-dimensional ellipsoid \mathcal E with semi-axes \sigma _1\geq \dots \geq \sigma _m with random subspaces of codimension n, where n can be much smaller than m. We find that, under certain assumptions on \sigma, this random radius \mathcal {R}_n=\mathcal {R}_n(\sigma ) is of the same order as the minimal such radius \sigma _{n+1} with high probability. In other situations \mathcal {R}_n is close to the maximum \sigma _1. The random variable \mathcal {R}_n naturally corresponds to the worst-case error of the best algorithm based on random information for L_2-approximation of functions from a compactly embedded Hilbert space H with unit ball \mathcal E. In particular, \sigma _k is the kth largest singular value of the embedding H\hookrightarrow L_2. In this formulation, one can also consider the case m=\infty, and we prove that random information behaves very differently depending on whether \sigma \in \ell _2 or not. For \sigma \notin \ell _2 we get \mathbb {E}[\mathcal {R}_n] = \sigma _1 and random information is completely useless. For \sigma \in \ell _2 the expected radius tends to zero at least at rate o(1/\sqrt {n}) as n\to \infty. In the important case \[ \sigma _k \asymp k^{-\alpha } \ln ^{-\beta }(k+1), \] where \alpha > 0 and \beta \in \mathbb {R} (which corresponds to various Sobolev embeddings), we prove \begin{equation*} \mathbb E [\mathcal {R}_n(\sigma )] \asymp \left \{\begin {array}{cl} \sigma _1 & \text {if} \quad \alpha < 1/2 \text { or } \beta \leq \alpha = 1/2, \\[2mm] \sigma _{n+1}\sqrt {\ln (n+1)} & \text {if} \quad \beta > \alpha = 1/2, \\[2mm] \sigma _{n+1} & \text {if} \quad \alpha > 1/2. \end{array}\right . \end{equation*} In the proofs we use a comparison result for Gaussian processes à la Gordon, exponential estimates for sums of chi-squared random variables, and estimates for the extreme singular values of (structured) Gaussian random matrices. The upper bound is constructive: it is proven for the worst-case error of a least squares estimator.
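As a numerical illustration (not part of the paper, and not its proof technique), the random radius \mathcal{R}_n can be simulated directly. Realizing the random codimension-n subspace as the kernel of an n x m Gaussian matrix G and substituting x = Dy with D = diag(\sigma), the circumradius of \mathcal{E} \cap \ker G equals the largest singular value of DN, where the columns of N form an orthonormal basis of \ker(GD). A minimal sketch, assuming NumPy and SciPy:

```python
# Monte Carlo sketch of the random section radius R_n (illustration only).
# The ellipsoid is E = D B_2^m with D = diag(sigma_1, ..., sigma_m); the
# random subspace is ker(G) for an n x m Gaussian matrix G, whose kernel
# is uniformly distributed. Writing x = D y, the circumradius of E ∩ ker(G)
# is the spectral norm of D N, with N an orthonormal basis of ker(G D).
import numpy as np
from scipy.linalg import null_space

def random_section_radius(sigma, n, rng):
    """Circumradius of E ∩ ker(G) for one Gaussian draw of G (n x m)."""
    m = len(sigma)
    D = np.diag(sigma)
    G = rng.standard_normal((n, m))
    N = null_space(G @ D)            # orthonormal basis of {y : G D y = 0}
    return np.linalg.norm(D @ N, 2)  # largest singular value of D N

rng = np.random.default_rng(0)
m, n, alpha = 200, 20, 2.0
sigma = np.arange(1, m + 1) ** -alpha   # sigma_k = k^{-alpha}, alpha > 1/2

radii = [random_section_radius(sigma, n, rng) for _ in range(20)]
# Every codimension-n subspace gives radius >= sigma_{n+1} (and <= sigma_1);
# for alpha > 1/2 the theorem says E[R_n] is of the order sigma_{n+1}.
print(sigma[n], np.mean(radii), sigma[0])
```

For \alpha \leq 1/2 (truncated to finite m) the same simulation shows the radii staying near \sigma_1 instead, matching the dichotomy in the abstract.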
ISSN: 0002-9947, 1088-6850
DOI: 10.1090/tran/8502