ADMM for Penalized Quantile Regression in Big Data

Liqun Yu, Nan Lin

Research output: Contribution to journal › Article › peer-review

31 Scopus citations

Abstract

Traditional linear programming (LP) algorithms for quantile regression, such as the simplex method and the interior point method, work well for data of small to moderate size. However, these methods are difficult to generalize to high-dimensional big data, for which penalization is usually necessary. Further, the massive size of contemporary big data calls for the development of large-scale algorithms on distributed computing platforms. The traditional LP algorithms are intrinsically sequential and not suitable for such frameworks. In this paper, we discuss how to use the alternating direction method of multipliers (ADMM) to solve large-scale penalized quantile regression problems. The ADMM algorithm is easily parallelized and can be implemented in modern distributed frameworks. Simulation results demonstrate that ADMM is as accurate as the traditional LP algorithms and faster even in the nonparallel case.
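
To make the approach concrete, the sketch below shows one common ADMM formulation of lasso-penalized quantile regression: the check loss is split off through a residual variable r = y − Xβ with the constraint Xβ + r = y, the r-update uses the closed-form proximal operator of the check loss, and the β-update is a linearized (proximal) soft-thresholding step. This is a minimal illustration under those assumptions, not the paper's exact algorithm; the function names and parameter choices are hypothetical.

```python
import numpy as np

def check_loss_prox(v, tau, eta):
    """Elementwise proximal operator of the quantile check loss:
    argmin_r rho_tau(r) + (eta/2) * (r - v)^2."""
    return np.maximum(v - tau / eta, 0.0) + np.minimum(v + (1.0 - tau) / eta, 0.0)

def soft_threshold(v, t):
    """Elementwise soft-thresholding, the proximal operator of t * |.|_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_quantile_lasso(X, y, tau=0.5, lam=0.1, rho=1.0, n_iter=500):
    """Sketch of ADMM for lasso-penalized quantile regression:
        min_beta  sum_i rho_tau(y_i - x_i' beta) + lam * ||beta||_1,
    reformulated with residuals r and the constraint X beta + r = y.
    The beta-update is a linearized (proximal) step, a simplifying assumption."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y.copy()
    u = np.zeros(n)                            # scaled dual variable
    eta = 1.01 * rho * np.linalg.norm(X, 2) ** 2  # step bound >= rho * lambda_max(X'X)
    for _ in range(n_iter):
        # beta-update: soft-thresholded gradient step on the augmented Lagrangian
        grad = rho * X.T @ (X @ beta + r - y + u)
        beta = soft_threshold(beta - grad / eta, lam / eta)
        # r-update: closed-form proximal operator of the check loss
        r = check_loss_prox(y - X @ beta - u, tau, rho)
        # dual update on the constraint X beta + r = y
        u = u + X @ beta + r - y
    return beta
```

With tau = 0.5 and lam = 0 this reduces to (unpenalized) median regression; larger values of lam drive more coefficients exactly to zero, which is the penalized setting the abstract refers to.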

Original language: English
Pages (from-to): 494-518
Number of pages: 25
Journal: International Statistical Review
Volume: 85
Issue number: 3
DOIs
State: Published - Dec 2017

Keywords

  • ADMM
  • divide-and-conquer
  • Hadoop
  • large-scale
  • MapReduce
  • penalized quantile regression

