Abstract
Optimal estimation and inference for both the minimizer and minimum of a convex regression function under the white noise and nonparametric regression models are studied in a nonasymptotic local minimax framework, where the performance of a procedure is evaluated at individual functions. Fully adaptive and computationally efficient algorithms are proposed, and sharp minimax lower bounds are given for both the estimation accuracy and the expected length of confidence intervals for the minimizer and minimum. The nonasymptotic local minimax framework brings out new phenomena in simultaneous estimation and inference for the minimizer and minimum. We establish a novel uncertainty principle that provides a fundamental limit on how well the minimizer and minimum can be estimated simultaneously for any convex regression function. A similar result holds for the expected length of the confidence intervals for the minimizer and minimum.
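To make the estimation problem concrete, the following sketch simulates the nonparametric regression setting and recovers the minimizer and minimum of a convex function from noisy observations. It uses a naive global quadratic least-squares fit purely for illustration; the function `f`, the noise level, and the fitting strategy are all assumptions of this example, not the paper's adaptive procedure, which localizes around the minimum at a data-driven resolution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance of the problem: observe y_i = f(x_i) + noise for a convex f.
# Here f(x) = (x - 0.3)^2 + 1, so the true minimizer is 0.3 and the true
# minimum is 1.  (This f is a placeholder chosen for illustration.)
f = lambda x: (x - 0.3) ** 2 + 1.0
n = 500
x = np.linspace(0.0, 1.0, n)
y = f(x) + 0.05 * rng.standard_normal(n)

# Naive estimator: fit a quadratic by least squares and read off its vertex.
# polyfit returns coefficients in decreasing degree order (a, b, c).
a, b, c = np.polyfit(x, y, 2)
minimizer_hat = -b / (2.0 * a)                      # vertex location
minimum_hat = np.polyval([a, b, c], minimizer_hat)  # vertex height

print(f"estimated minimizer: {minimizer_hat:.3f}")
print(f"estimated minimum:   {minimum_hat:.3f}")
```

A global polynomial fit works here only because the toy `f` is itself quadratic; for a general convex regression function, the accuracy attainable for the minimizer and the minimum is governed by the local geometry of `f` near its minimum, which is exactly what the paper's local minimax framework quantifies.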
| Original language | English |
|---|---|
| Pages (from-to) | 392-411 |
| Number of pages | 20 |
| Journal | Annals of Statistics |
| Volume | 52 |
| Issue number | 1 |
| DOIs | |
| State | Published - Feb 2024 |
Keywords
- adaptivity
- confidence interval
- minimax optimality
- modulus of continuity
- nonparametric regression
- uncertainty principle
- white noise model