optimization

 
 
Reply Wed 10 Oct, 2012 04:10 am
Consider the following unconstrained optimization problem:

(NLP)   min f(x) = 2 x_1^2 + x_2^2 + x_1 x_3 - 3 x_2 + x_3

Perform one step of the gradient method starting from x0, using exact line search.
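For a quadratic objective f(x) = (1/2) x^T Q x + c^T x, the exact line-search step along the steepest-descent direction d = -g has the closed form alpha = (g^T g) / (g^T Q g), valid whenever g^T Q g > 0. A minimal sketch in Python, assuming the unspecified starting point is x0 = (0, 0, 0) purely for illustration:

```python
import numpy as np

# f(x) = 2 x1^2 + x2^2 + x1 x3 - 3 x2 + x3 = 0.5 x^T Q x + c^T x
Q = np.array([[4.0, 0.0, 1.0],
              [0.0, 2.0, 0.0],
              [1.0, 0.0, 0.0]])   # Hessian of f
c = np.array([0.0, -3.0, 1.0])   # linear term of f

def f(x):
    return 0.5 * x @ Q @ x + c @ x

def grad(x):
    return Q @ x + c

x0 = np.zeros(3)                 # assumed starting point (not given in the post)
g = grad(x0)                     # gradient at x0: (0, -3, 1)
alpha = (g @ g) / (g @ Q @ g)    # exact line search on a quadratic: 10/18 = 5/9
x1 = x0 - alpha * g              # one gradient step: (0, 5/3, -5/9)
print(alpha, x1, f(x1))
```

With this starting point the step size is 5/9 and the objective decreases from f(x0) = 0 to f(x1) = -25/9; a different x0 would of course give a different step.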

Suppose that the gradient method applied to problem (NLP), starting at x0, converges to a stationary point x*. Is x* the solution of problem (NLP)? Why or why not? What is the solution of problem (NLP)?

Calculate an upper bound on the convergence rate β of the gradient method applied to problem (NLP).
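A stationary point solves ∇f(x) = Qx + c = 0, and whether it is a minimizer depends on the eigenvalues of the Hessian Q; for a positive definite Q with eigenvalues 0 < mu <= ... <= L, the classical rate bound for the gradient method with exact line search is beta <= ((L - mu)/(L + mu))^2. A short numerical check (Q and c here denote the Hessian and linear term of f, written out from the objective above):

```python
import numpy as np

Q = np.array([[4.0, 0.0, 1.0],
              [0.0, 2.0, 0.0],
              [1.0, 0.0, 0.0]])   # Hessian of f
c = np.array([0.0, -3.0, 1.0])   # linear term of f

x_star = np.linalg.solve(Q, -c)  # stationary point: grad f = Q x + c = 0
eigs = np.linalg.eigvalsh(Q)     # eigenvalues of the symmetric Hessian
print(x_star, eigs)
# If all eigenvalues were positive, the rate bound would be
# beta <= ((L - mu) / (L + mu))**2 with mu = min(eigs), L = max(eigs).
# Here one eigenvalue is negative (2 - sqrt(5) < 0), so x* = (-1, 3/2, 4)
# is a saddle point and f is unbounded below along that eigenvector.
```

This is only a numerical sketch of the structure of the problem, not a full written answer to the exercise.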