On the convergence properties of the projected gradient method for convex optimization
AUTHOR(S)
Iusem, A. N.
SOURCE
Computational & Applied Mathematics
PUBLICATION DATE
2003
ABSTRACT
When applied to an unconstrained minimization problem with a convex objective, the steepest descent method has stronger convergence properties than in the nonconvex case: the whole sequence converges to an optimal solution under the sole hypothesis that minimizers exist (i.e., without assuming, e.g., boundedness of the level sets). In this paper we study the projected gradient method for constrained convex minimization. Convergence of the whole sequence to a minimizer, assuming only the existence of solutions, has already been established for the variant in which the stepsizes are exogenously given and square summable. Here we prove the result for the more standard (and more efficient) variant, namely the one in which the stepsizes are determined through an Armijo search.
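To make the method under discussion concrete, here is a minimal numerical sketch of the projected gradient method with an Armijo backtracking search. This is an illustration of the general scheme, not the paper's exact algorithm or its convergence analysis; the feasible set (a box, so the projection is componentwise clipping), the problem data, and the parameter values are assumptions chosen for the example.

```python
import numpy as np

def project(x, lo, hi):
    """Euclidean projection onto the box [lo, hi] (componentwise clipping)."""
    return np.clip(x, lo, hi)

def projected_gradient_armijo(f, grad, x0, lo, hi,
                              beta=0.5, sigma=1e-4, max_iter=500, tol=1e-8):
    """Projected gradient method with Armijo backtracking.

    At each iterate x, the stepsize t is shrunk by the factor beta until the
    sufficient-decrease condition
        f(P_C(x - t*g)) <= f(x) + sigma * <g, P_C(x - t*g) - x>
    holds, where P_C is the projection onto the feasible box.
    """
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        g = grad(x)
        t = 1.0
        while True:
            x_new = project(x - t * g, lo, hi)
            if f(x_new) <= f(x) + sigma * g.dot(x_new - x):
                break
            t *= beta
        # Stop when the iterates stall (x is then an approximate fixed
        # point of the projected gradient map, hence near-optimal for
        # convex f).
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example problem (made up): minimize ||x - c||^2 over the box [0, 1]^2
# with c outside the box; the minimizer is the projection of c onto the box.
c = np.array([2.0, -0.5])
f = lambda x: float(np.sum((x - c) ** 2))
grad = lambda x: 2.0 * (x - c)
x_star = projected_gradient_armijo(f, grad, np.zeros(2),
                                   lo=np.zeros(2), hi=np.ones(2))
# x_star converges to the projection of c onto the box, i.e. (1, 0).
```

The variant contrasted in the abstract would instead replace the Armijo loop with a pre-specified square-summable stepsize sequence (e.g., t_k = 1/(k+1)), which requires no function evaluations per step but is typically much slower in practice.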
Related Documents
- Spectral projected gradient method for the procrustes problem
- On the use of the Spectral Projected Gradient method for Support Vector Machines
- The global convergence of a descent PRP conjugate gradient method
- New versions of the Hestenes-Stiefel nonlinear conjugate gradient method based on the secant condition for optimization
- Quasiconvex Optimization Problems: Gradient Method for Scalar and Vector-Valued Functions