Wed 10 Oct, 2012 04:10 am
Consider the following unconstrained optimization problem:
(NLP)   min f(x) = 2 x_1^2 + x_2^2 + x_1 x_3 - 3 x_2 + x_3
Perform one step of the gradient method starting from x0, using exact line search.
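The post does not state the starting point x0 numerically, so the sketch below assumes x0 = (0, 0, 0) purely for illustration. It writes f as a quadratic 0.5*x'Qx + c'x and performs one gradient step with the closed-form exact line-search step length for quadratics, alpha* = (g'g)/(g'Qg):

```python
import numpy as np

# f(x) = 2*x1^2 + x2^2 + x1*x3 - 3*x2 + x3, written as 0.5*x'Qx + c'x.
Q = np.array([[4.0, 0.0, 1.0],
              [0.0, 2.0, 0.0],
              [1.0, 0.0, 0.0]])
c = np.array([0.0, -3.0, 1.0])

def grad(x):
    """Gradient of the quadratic: Qx + c."""
    return Q @ x + c

# Assumed starting point (not given in the original post).
x0 = np.zeros(3)
g = grad(x0)                      # g = (0, -3, 1)

# Exact line search along d = -g for a quadratic objective:
# alpha* = (g'g) / (g'Qg), valid only while g'Qg > 0.
alpha = (g @ g) / (g @ Q @ g)     # = 10/18 = 5/9 here
x1 = x0 - alpha * g               # = (0, 5/3, -5/9)
print(alpha, x1)
```

With this choice of x0 the curvature term g'Qg = 18 is positive, so the exact step is well defined; a different x0 would of course give a different iterate.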
Suppose the gradient method applied to problem (NLP), starting at x0, converges to a stationary point x*. Is x* the solution of problem (NLP)? Why? What is the solution of problem (NLP)?
Calculate an upper bound on the convergence rate β of the gradient method applied to problem (NLP).
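For a quadratic with a positive definite Hessian Q, the gradient method with exact line search converges linearly with rate bounded by β = ((λmax − λmin)/(λmax + λmin))^2, where λmin and λmax are the extreme eigenvalues of Q. A sketch that checks this for the Hessian of the f written above (note: as stated, f has no x3^2 term, so positive definiteness has to be verified rather than assumed):

```python
import numpy as np

# Hessian of f(x) = 2*x1^2 + x2^2 + x1*x3 - 3*x2 + x3.
Q = np.array([[4.0, 0.0, 1.0],
              [0.0, 2.0, 0.0],
              [1.0, 0.0, 0.0]])

lam = np.linalg.eigvalsh(Q)       # eigenvalues in ascending order
lam_min, lam_max = lam[0], lam[-1]
print(lam)                        # 2 - sqrt(5), 2, 2 + sqrt(5)

# The bound beta = ((lam_max - lam_min)/(lam_max + lam_min))^2
# applies only when Q is positive definite (all eigenvalues > 0).
if lam_min > 0:
    beta = ((lam_max - lam_min) / (lam_max + lam_min)) ** 2
    print("beta upper bound:", beta)
else:
    print("Hessian is not positive definite; the linear-rate bound does not apply.")
```

Since one eigenvalue here is 2 − sqrt(5) < 0, the Hessian of f exactly as written is indefinite; if the intended objective included an x3^2 term, the same script would produce the bound directly.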