Bayesian regularized quantile regression

  • Qing Li
  • Ruibin Xi
  • Nan Lin

Research output: Contribution to journal › Article › peer-review

135 Scopus citations

Abstract

Regularization, e.g. the lasso, has been shown to be effective in improving prediction accuracy in quantile regression (Li and Zhu 2008; Wu and Liu 2009). This paper studies regularization in quantile regression from a Bayesian perspective. By proposing a hierarchical model framework, we give a generic treatment to a set of regularization approaches, including the lasso, group lasso and elastic net penalties. Gibbs samplers are derived for all cases. This is the first work to discuss regularized quantile regression with the group lasso penalty and the elastic net penalty. Both simulated and real data examples show that Bayesian regularized quantile regression methods often outperform quantile regression without regularization and their non-Bayesian counterparts with regularization.
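To make the setting concrete, the following is a minimal illustrative sketch (not the paper's Gibbs sampler) of the lasso-regularized quantile regression objective, whose minimizer corresponds to the posterior mode under the hierarchical Bayesian formulation the abstract describes. The data, quantile level `tau`, and penalty `lam` are hypothetical choices for demonstration only.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical toy data: sparse linear model with 3 active coefficients
n, p = 200, 10
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + rng.normal(size=n)

def check_loss(r, tau):
    # Quantile ("pinball") loss: rho_tau(r) = r * (tau - 1{r < 0})
    return np.sum(r * (tau - (r < 0)))

def objective(beta, tau, lam):
    # Check loss plus an L1 (lasso) penalty on the coefficients;
    # this is the frequentist analogue of the Bayesian lasso
    # quantile regression posterior mode
    return check_loss(y - X @ beta, tau) + lam * np.sum(np.abs(beta))

tau, lam = 0.5, 5.0  # median regression with a modest penalty
res = minimize(objective, np.zeros(p), args=(tau, lam), method="Powell")
print(np.round(res.x, 2))
```

The Bayesian treatment in the paper replaces this point estimate with full posterior inference via Gibbs sampling, which also yields uncertainty quantification for the coefficients.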

Original language: English
Pages (from-to): 533-556
Number of pages: 24
Journal: Bayesian Analysis
Volume: 5
Issue number: 3
State: Published - 2010

Keywords

  • Bayesian analysis
  • Elastic net
  • Gibbs sampler
  • Group lasso
  • Lasso
  • Quantile regression
  • Regularization
