ESTIMATION AND INFERENCE FOR MINIMIZER AND MINIMUM OF CONVEX FUNCTIONS: OPTIMALITY, ADAPTIVITY AND UNCERTAINTY PRINCIPLES

  • T. Tony Cai
  • Ran Chen
  • Yuancheng Zhu

    Research output: Contribution to journal › Article › peer-review

    1 Scopus citation

    Abstract

    Optimal estimation and inference for both the minimizer and minimum of a convex regression function under the white noise and nonparametric regression models are studied in a nonasymptotic local minimax framework, where the performance of a procedure is evaluated at individual functions. Fully adaptive and computationally efficient algorithms are proposed and sharp minimax lower bounds are given for both the estimation accuracy and expected length of confidence intervals for the minimizer and minimum. The nonasymptotic local minimax framework brings out new phenomena in simultaneous estimation and inference for the minimizer and minimum. We establish a novel uncertainty principle that provides a fundamental limit on how well the minimizer and minimum can be estimated simultaneously for any convex regression function. A similar result holds for the expected length of the confidence intervals for the minimizer and minimum.
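    The abstract concerns estimating both the minimizer (argmin) and minimum value of a convex regression function from noisy observations. As a toy illustration only, and not the paper's adaptive procedure, the sketch below simulates the nonparametric regression model with a hypothetical convex function and forms crude plug-in estimates via a quadratic least-squares fit:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Nonparametric regression model: y_i = f(x_i) + noise, with f convex.
    # Illustrative convex function (an assumption, not from the paper):
    # f(x) = (x - 0.3)^2, so the true minimizer is 0.3 and the minimum is 0.
    n = 500
    x = np.linspace(0.0, 1.0, n)
    y = (x - 0.3) ** 2 + 0.05 * rng.standard_normal(n)

    # Crude plug-in estimates: fit a quadratic by least squares and read off
    # its vertex (minimizer estimate) and the fitted value there (minimum
    # estimate).  The paper's fully adaptive procedures are far more general;
    # this is only a minimal sketch of the estimation problem.
    a, b, c = np.polyfit(x, y, 2)
    minimizer_hat = -b / (2 * a)
    minimum_hat = np.polyval([a, b, c], minimizer_hat)

    print(minimizer_hat, minimum_hat)
    ```

    The uncertainty principle established in the paper says that no procedure can estimate both quantities simultaneously at the optimal rate for every convex function, a trade-off this simple plug-in sketch does not capture.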

    Original language: English
    Pages (from-to): 392-411
    Number of pages: 20
    Journal: Annals of Statistics
    Volume: 52
    Issue number: 1
    DOIs
    State: Published - Feb 2024

    Keywords

    • adaptivity
    • confidence interval
    • minimax optimality
    • modulus of continuity
    • nonparametric regression
    • uncertainty principle
    • white noise model
