Subgradient method

Subgradient methods are algorithms for solving convex optimization problems. Originally developed by Naum Z. Shor and others in the 1960s and 1970s, subgradient methods can be used with a non-differentiable objective function. When the objective function is differentiable, subgradient methods for unconstrained problems use the same search direction as the method of steepest descent.

Although subgradient methods can be much slower than interior-point methods and Newton's method in practice, they can be immediately applied to a far wider variety of problems and require much less memory. Moreover, by combining the subgradient method with primal or dual decomposition techniques, it is sometimes possible to develop a simple distributed algorithm for a problem.

Basic subgradient update

Let f:\mathbb{R}^n \to \mathbb{R} be a convex function with domain \mathbb{R}^n. The subgradient method uses the iteration

x^{(k+1)} = x^{(k)} - \alpha_k g^{(k)}
where g^{(k)} denotes a subgradient of f at x^{(k)}. If f is differentiable, its only subgradient is the gradient vector \nabla f itself. It may happen that -g^{(k)} is not a descent direction for f at x^{(k)}. We therefore keep a running record f_{\mathrm{best}} of the lowest objective function value found so far, i.e.

f_{\mathrm{best}}^{(k)} = \min\{f_{\mathrm{best}}^{(k-1)}, f(x^{(k)})\}.
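
As a concrete illustration (not part of the original article), the following minimal Python sketch runs this iteration on the nondifferentiable function f(x) = |x_1| + 2|x_2|; the choice of f, its subgradient, the starting point, and the constant step size are all illustrative assumptions, and the step size rules discussed in the next section give other options.

```python
# Minimal sketch of the basic subgradient iteration on an
# illustrative nondifferentiable function f(x) = |x1| + 2|x2|.
import numpy as np

def f(x):
    return abs(x[0]) + 2 * abs(x[1])

def subgrad(x):
    # np.sign(0) = 0 is a valid subgradient of |.| at 0
    return np.array([np.sign(x[0]), 2 * np.sign(x[1])])

x = np.array([3.0, -2.0])
f_best = f(x)
for k in range(1, 101):
    alpha = 0.1                     # constant step size (one of the rules below)
    x = x - alpha * subgrad(x)      # x^(k+1) = x^(k) - alpha_k g^(k)
    f_best = min(f_best, f(x))      # lowest objective value found so far
print(f_best)
```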

Step size rules

Many different types of step size rules are used in the subgradient method. Five basic step size rules for which convergence is guaranteed are:

  • Constant step size, \alpha_k = \alpha.
  • Constant step length, \alpha_k = \gamma/\lVert g^{(k)} \rVert_2, which gives \lVert x^{(k+1)} - x^{(k)} \rVert_2 = \gamma.
  • Square summable but not summable step size, i.e. any step sizes satisfying

\alpha_k \geq 0, \qquad \sum_{k=1}^\infty \alpha_k^2 < \infty, \qquad \sum_{k=1}^\infty \alpha_k = \infty.

  • Nonsummable diminishing, i.e. any step sizes satisfying

\alpha_k \geq 0, \qquad \lim_{k\to\infty} \alpha_k = 0, \qquad \sum_{k=1}^\infty \alpha_k = \infty.

  • Nonsummable diminishing step lengths, i.e. \alpha_k = \gamma_k/\lVert g^{(k)} \rVert_2, where

\gamma_k \geq 0, \qquad \lim_{k\to\infty} \gamma_k = 0, \qquad \sum_{k=1}^\infty \gamma_k = \infty.
Notice that the step sizes listed above are determined before the algorithm is run and do not depend on any data computed during the algorithm. This is very different from the step size rules found in standard descent methods, which depend on the current point and search direction.
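
As a hedged sketch, the helper below shows one concrete sequence for each rule; the particular sequences (\alpha/k for the square summable rule, \alpha/\sqrt{k} for the diminishing rules) and the constants are illustrative assumptions, and any sequences satisfying the conditions above would serve equally well.

```python
import numpy as np

def step_size(rule, k, g, alpha=0.1, gamma=0.1):
    """Return alpha_k for iteration k (k >= 1) given subgradient g.

    alpha and gamma are illustrative positive constants.
    """
    norm_g = np.linalg.norm(g)
    if rule == "constant_step_size":
        return alpha                        # alpha_k = alpha
    if rule == "constant_step_length":
        return gamma / norm_g               # ||x^(k+1) - x^(k)||_2 = gamma
    if rule == "square_summable":
        return alpha / k                    # sum alpha_k^2 < inf, sum alpha_k = inf
    if rule == "nonsummable_diminishing":
        return alpha / np.sqrt(k)           # alpha_k -> 0, sum alpha_k = inf
    if rule == "diminishing_step_length":
        return (gamma / np.sqrt(k)) / norm_g
    raise ValueError(rule)
```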

Convergence results

For constant step size and constant step length, the subgradient algorithm is guaranteed to converge to within some range of the optimal value, i.e.,

\lim_{k\to\infty} f_{\mathrm{best}}^{(k)} - f^* < \epsilon,

where \epsilon is a number that depends on the step size (or step length). For the square summable and diminishing step size rules, the algorithm is guaranteed to converge to the optimal value, i.e., \lim_{k\to\infty} f_{\mathrm{best}}^{(k)} = f^*.
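
For example, under the standard assumptions that the subgradient norms are bounded, \lVert g^{(k)} \rVert_2 \leq G for all k, and that the starting point satisfies \lVert x^{(1)} - x^* \rVert_2 \leq R for some minimizer x^*, the usual analysis of the subgradient method yields the bound

f_{\mathrm{best}}^{(k)} - f^* \leq \frac{R^2 + G^2 \sum_{i=1}^k \alpha_i^2}{2 \sum_{i=1}^k \alpha_i}.

With a constant step size \alpha the right-hand side converges to G^2\alpha/2, which quantifies the "range of the optimal value" above, while with square summable but not summable step sizes it converges to zero.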

Constrained optimization

Projected subgradient

One extension of the subgradient method is the projected subgradient method, which solves the constrained optimization problem
minimize f(x)
subject to x \in \mathcal{C}

where \mathcal{C} is a convex set. The projected subgradient method uses the iteration

x^{(k+1)} = P\left(x^{(k)} - \alpha_k g^{(k)}\right)

where P is the projection onto \mathcal{C} and g^{(k)} is any subgradient of f at x^{(k)}.
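
The sketch below illustrates this iteration in Python for an assumed example problem: minimizing f(x) = \lVert x - c \rVert_1 over the Euclidean unit ball \mathcal{C} = \{x : \lVert x \rVert_2 \leq 1\}. The objective, the data c, the projection, and the step size rule are illustrative choices, not part of the article.

```python
# Minimal sketch of the projected subgradient method:
# minimize ||x - c||_1 over the unit Euclidean ball.
import numpy as np

c = np.array([2.0, -1.0])

def subgrad(x):
    return np.sign(x - c)            # a subgradient of ||x - c||_1

def project(x):
    # Euclidean projection onto {x : ||x||_2 <= 1}
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

x = np.zeros(2)
f_best = np.inf
for k in range(1, 201):
    alpha = 1.0 / k                          # square summable but not summable
    x = project(x - alpha * subgrad(x))      # x^(k+1) = P(x^(k) - alpha_k g^(k))
    f_best = min(f_best, np.sum(np.abs(x - c)))
print(x, f_best)
```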

General constraints

The subgradient method can be extended to solve the inequality constrained problem

minimize f_0(x)
subject to f_i(x) \leq 0, \quad i = 1,\dots,m

where f_i are convex. The algorithm takes the same form as the unconstrained case

x^{(k+1)} = x^{(k)} - \alpha_k g^{(k)}

where \alpha_k > 0 is a step size, and g^{(k)} is a subgradient of the objective or one of the constraint functions at x^{(k)}. Take

g^{(k)} =
\begin{cases}
 \partial f_0(x) & \text{if } f_i(x) \leq 0, \quad i = 1,\dots,m \\
 \partial f_j(x) & \text{if } f_j(x) > 0 \text{ for some } j
\end{cases}

where \partial f denotes the subdifferential of f. If the current point is feasible, the algorithm uses an objective subgradient; if the current point is infeasible, the algorithm chooses a subgradient of any violated constraint.
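
The following minimal sketch shows this feasible/infeasible switch for an assumed example: minimizing the linear objective f_0(x) = c^T x subject to the single constraint f_1(x) = \lVert x \rVert_2^2 - 1 \leq 0. The problem data, starting point, step sizes, and iteration count are illustrative assumptions; note that the best value is recorded only at feasible points.

```python
# Minimal sketch of the subgradient method with one inequality constraint:
#   minimize c^T x   subject to   ||x||_2^2 - 1 <= 0.
import numpy as np

c = np.array([1.0, 2.0])

def f0(x): return c @ x            # linear objective
def f1(x): return x @ x - 1.0      # convex constraint function

x = np.array([2.0, 2.0])           # deliberately infeasible start
f_best, x_best = np.inf, x
for k in range(1, 501):
    if f1(x) <= 0:                 # feasible: use an objective subgradient
        g = c                      # gradient of the linear objective
        if f0(x) < f_best:
            f_best, x_best = f0(x), x.copy()
    else:                          # infeasible: use a subgradient of the
        g = 2 * x                  #   violated constraint f1
    x = x - (1.0 / k) * g          # same update as the unconstrained case
print(x_best, f_best)
```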
