TY - JOUR
T1 - An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization
AU - Lin, Qihang
AU - Xiao, Lin
N1 - Publisher Copyright:
© 2014, Springer Science+Business Media New York.
PY - 2015/4
Y1 - 2015/4
N2 - We consider optimization problems with an objective function that is the sum of two convex terms: one is smooth and given by a black-box oracle, and the other is general but with a simple, known structure. We first present an accelerated proximal gradient (APG) method for problems where the smooth part of the objective function is also strongly convex. This method incorporates an efficient line-search procedure, and achieves the optimal iteration complexity for such composite optimization problems. In case the strong convexity parameter is unknown, we also develop an adaptive scheme that can automatically estimate it on the fly, at the cost of a slightly worse iteration complexity. Then we focus on the special case of solving the ℓ1-regularized least-squares problem in the high-dimensional setting. In such a context, the smooth part of the objective (least-squares) is not strongly convex over the entire domain. Nevertheless, we can exploit its restricted strong convexity over sparse vectors using the adaptive APG method combined with a homotopy continuation scheme. We show that such a combination leads to a global geometric rate of convergence, and the overall iteration complexity has a weaker dependency on the restricted condition number than previous work.
AB - We consider optimization problems with an objective function that is the sum of two convex terms: one is smooth and given by a black-box oracle, and the other is general but with a simple, known structure. We first present an accelerated proximal gradient (APG) method for problems where the smooth part of the objective function is also strongly convex. This method incorporates an efficient line-search procedure, and achieves the optimal iteration complexity for such composite optimization problems. In case the strong convexity parameter is unknown, we also develop an adaptive scheme that can automatically estimate it on the fly, at the cost of a slightly worse iteration complexity. Then we focus on the special case of solving the ℓ1-regularized least-squares problem in the high-dimensional setting. In such a context, the smooth part of the objective (least-squares) is not strongly convex over the entire domain. Nevertheless, we can exploit its restricted strong convexity over sparse vectors using the adaptive APG method combined with a homotopy continuation scheme. We show that such a combination leads to a global geometric rate of convergence, and the overall iteration complexity has a weaker dependency on the restricted condition number than previous work.
KW - First-order method
KW - Homotopy continuation
KW - L1-regularized least-squares
KW - Proximal gradient method
KW - Sparse optimization
UR - http://www.scopus.com/inward/record.url?scp=84924987860&partnerID=8YFLogxK
U2 - 10.1007/s10589-014-9694-4
DO - 10.1007/s10589-014-9694-4
M3 - Article
AN - SCOPUS:84924987860
SN - 0926-6003
VL - 60
SP - 633
EP - 674
JO - Computational Optimization and Applications
JF - Computational Optimization and Applications
IS - 3
ER -
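
The abstract above describes a proximal gradient method for ℓ1-regularized least-squares combined with homotopy continuation on the regularization parameter. The following is a minimal illustrative sketch of that general idea only: a plain proximal gradient (ISTA-style) inner solver with warm starts along a geometrically decreasing regularization path. It is not the authors' adaptive APG method; the function names, step-size choice, continuation factor, and stopping rule are assumptions made for illustration.

# Illustrative sketch, NOT the adaptive APG method of Lin and Xiao (2015).
# Solves 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient steps, with a
# simple homotopy continuation loop that shrinks lam toward a target value.
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||x||_1 (componentwise soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_grad_lasso(A, b, lam, x0, step, max_iter=500, tol=1e-8):
    # Plain (non-accelerated) proximal gradient iterations for fixed lam.
    x = x0.copy()
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)                  # gradient of the smooth part
        x_new = soft_threshold(x - step * grad, step * lam)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

def homotopy_lasso(A, b, lam_target, eta=0.7):
    # Continuation: start at lam_max (zero solution), shrink lam by the factor
    # eta at each stage, warm-starting each stage from the previous solution.
    lam = np.max(np.abs(A.T @ b))                 # lam >= lam_max gives x = 0
    step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1/L, L = spectral norm squared
    x = np.zeros(A.shape[1])
    while lam > lam_target:
        lam = max(eta * lam, lam_target)
        x = prox_grad_lasso(A, b, lam, x, step)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 1000))
    x_true = np.zeros(1000); x_true[:10] = rng.standard_normal(10)
    b = A @ x_true + 0.01 * rng.standard_normal(200)
    x_hat = homotopy_lasso(A, b, lam_target=0.1)
    print("nonzeros recovered:", np.count_nonzero(np.abs(x_hat) > 1e-3))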