An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization

Qihang Lin, Lin Xiao

Research output: Contribution to journal › Article › peer-review

31 Scopus citations


We consider optimization problems with an objective function that is the sum of two convex terms: one is smooth and given by a black-box oracle, and the other is general but with a simple, known structure. We first present an accelerated proximal gradient (APG) method for problems where the smooth part of the objective function is also strongly convex. This method incorporates an efficient line-search procedure, and achieves the optimal iteration complexity for such composite optimization problems. In case the strong convexity parameter is unknown, we also develop an adaptive scheme that can automatically estimate it on the fly, at the cost of a slightly worse iteration complexity. Then we focus on the special case of solving the ℓ1-regularized least-squares problem in the high-dimensional setting. In such a context, the smooth part of the objective (least-squares) is not strongly convex over the entire domain. Nevertheless, we can exploit its restricted strong convexity over sparse vectors using the adaptive APG method combined with a homotopy continuation scheme. We show that such a combination leads to a global geometric rate of convergence, and the overall iteration complexity has a weaker dependency on the restricted condition number than previous work.
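To make the two ingredients of the abstract concrete, the sketch below pairs a standard accelerated proximal gradient loop (FISTA-style, with soft-thresholding as the proximal step) with a homotopy continuation outer loop that decreases the regularization parameter geometrically and warm-starts each stage. This is a minimal illustration of the general approach, not the paper's algorithm: the paper's method additionally uses a line search and adaptively estimates the (restricted) strong convexity parameter, which are omitted here; the function names, the continuation factor `eta`, and the fixed step size `1/L` are all assumptions for the example.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def apg_lasso(A, b, lam, x0, n_iters=200, L=None):
    """Accelerated proximal gradient (FISTA-style) for
    min_x 0.5 * ||A x - b||^2 + lam * ||x||_1,
    using a fixed step size 1/L instead of the paper's line search."""
    if L is None:
        # Lipschitz constant of the gradient of the smooth part: ||A||_2^2
        L = np.linalg.norm(A, 2) ** 2
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(n_iters):
        grad = A.T @ (A @ y - b)                  # gradient of the smooth part at y
        x_new = soft_threshold(y - grad / L, lam / L)  # proximal gradient step
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum (extrapolation)
        x, t = x_new, t_new
    return x

def homotopy_lasso(A, b, lam_target, eta=0.5, inner_iters=200):
    """Homotopy continuation: solve a sequence of lasso problems with a
    geometrically decreasing regularization parameter, warm-starting each
    stage from the previous solution so the iterates stay sparse."""
    lam = np.max(np.abs(A.T @ b))   # for lam >= this value the solution is 0
    x = np.zeros(A.shape[1])
    while lam > lam_target:
        lam = max(lam * eta, lam_target)
        x = apg_lasso(A, b, lam, x, n_iters=inner_iters)
    return x
```

Warm-starting along the continuation path is what lets the method exploit restricted strong convexity: each stage starts near a sparse point, so the least-squares term behaves as if strongly convex even though it is not over the whole space.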

Original language: English
Pages (from-to): 633-674
Number of pages: 42
Journal: Computational Optimization and Applications
Issue number: 3
State: Published - Apr 2015
Externally published: Yes


Keywords

  • First-order method
  • Homotopy continuation
  • L1-regularized least-squares
  • Proximal gradient method
  • Sparse optimization


