On overfitting and post-selection uncertainty assessments

  • L. Hong, T. A. Kuffner, R. Martin

    Research output: Contribution to journal › Article › peer-review

    18 Scopus citations

    Abstract

    In a regression context, when the relevant subset of explanatory variables is uncertain, it is common to use a data-driven model selection procedure. Classical linear model theory, applied naively to the selected submodel, may not be valid because it ignores the selected submodel's dependence on the data. We provide an explanation of this phenomenon, in terms of overfitting, for a class of model selection criteria.
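The phenomenon described in the abstract can be illustrated with a small simulation (this sketch is not from the paper itself, and uses hypothetical settings: pure-noise data, selection of the most significant of p candidate predictors by largest |t|-statistic, then a naive 95% confidence interval for the selected coefficient). Because the same data drive both selection and inference, the naive interval covers the true value (zero) far less often than the nominal 95%:

```python
# Minimal sketch (assumed setup, not the paper's method): under a global
# null, select the predictor with the largest |t| among p candidates in
# simple regressions, then check whether the naive 95% CI covers zero.
import random
import math

random.seed(0)
n, p, reps, z975 = 50, 10, 2000, 1.96
cover = 0
for _ in range(reps):
    # Response is pure noise: every candidate coefficient is truly zero.
    y = [random.gauss(0, 1) for _ in range(n)]
    best_t, selected_ci_covers = 0.0, True
    for _ in range(p):
        x = [random.gauss(0, 1) for _ in range(n)]
        sxx = sum(v * v for v in x)
        b = sum(xi * yi for xi, yi in zip(x, y)) / sxx   # OLS slope, no intercept
        resid = [yi - b * xi for xi, yi in zip(x, y)]
        s2 = sum(r * r for r in resid) / (n - 1)          # residual variance
        se = math.sqrt(s2 / sxx)
        t = abs(b / se)
        if t > best_t:                                    # data-driven selection
            best_t = t
            # Naive CI b +/- 1.96*se covers the true value 0 iff |t| < 1.96.
            selected_ci_covers = t < z975
    cover += selected_ci_covers
naive_coverage = cover / reps
print(naive_coverage)  # well below the nominal 0.95
```

With p independent null predictors, the selected |t| is the maximum of p (approximately) independent statistics, so the naive interval's coverage is roughly 0.95 to the power p rather than 0.95, matching the overfitting explanation the paper develops for data-driven selection criteria.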

    Original language: English
    Pages (from-to): 221-224
    Number of pages: 4
    Journal: Biometrika
    Volume: 105
    Issue number: 1
    DOIs
    State: Published - Mar 1 2018

    Keywords

    • Akaike information criterion
    • Bayesian information criterion
    • Model selection
    • Regression
