A proximal-gradient homotopy method for the ℓ1-regularized least-squares problem

Lin Xiao, Tong Zhang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

14 Scopus citations

Abstract

We consider the ℓ1-regularized least-squares problem for sparse recovery and compressed sensing. Since the objective function is not strongly convex, standard proximal gradient methods only achieve sublinear convergence. We propose a homotopy continuation strategy, which employs a proximal gradient method to solve the problem with a sequence of decreasing regularization parameters. It is shown that under common assumptions in compressed sensing, the proposed method ensures that all iterates along the homotopy solution path are sparse, and the objective function is effectively strongly convex along the solution path. This observation allows us to obtain a global geometric convergence rate for the procedure. Empirical results are presented to support our theoretical analysis.
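The continuation strategy described in the abstract can be illustrated with a minimal sketch: run a proximal gradient method (plain ISTA here, not the paper's exact scheme with its adaptive stopping criteria) on the lasso objective 0.5·||Ax − b||² + λ||x||₁, starting from a large regularization parameter and decreasing it geometrically toward the target value, warm-starting each stage from the previous solution. The function names (`soft_threshold`, `prox_grad`, `homotopy_prox_grad`) and the decrease factor `eta` are illustrative choices, not from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (elementwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad(A, b, lam, x0, tol=1e-6, max_iter=500):
    # ISTA on 0.5*||Ax - b||^2 + lam*||x||_1 with constant step 1/L.
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth part's gradient
    x = x0.copy()
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)
        x_new = soft_threshold(x - grad / L, lam / L)
        if np.linalg.norm(x_new - x) <= tol:
            return x_new
        x = x_new
    return x

def homotopy_prox_grad(A, b, lam_target, eta=0.7):
    # Homotopy continuation: start at the largest useful lambda
    # (above max|A^T b| the lasso solution is exactly zero) and
    # shrink it geometrically, warm-starting each stage.
    lam = np.max(np.abs(A.T @ b))
    x = np.zeros(A.shape[1])
    while lam > lam_target:
        lam = max(eta * lam, lam_target)
        x = prox_grad(A, b, lam, x)
    return x
```

Keeping the iterates sparse across stages is what the paper exploits: on the sparse iterates the objective behaves as if strongly convex, which is the source of the geometric convergence rate.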

Original language: English
Title of host publication: Proceedings of the 29th International Conference on Machine Learning, ICML 2012
Pages: 839-846
Number of pages: 8
State: Published - 2012
Externally published: Yes
Event: 29th International Conference on Machine Learning, ICML 2012 - Edinburgh, United Kingdom
Duration: Jun 26, 2012 - Jul 1, 2012

Publication series

Name: Proceedings of the 29th International Conference on Machine Learning, ICML 2012
Volume: 1

Conference

Conference: 29th International Conference on Machine Learning, ICML 2012
Country/Territory: United Kingdom
City: Edinburgh
Period: 06/26/12 - 07/01/12
