Optimization Theory HW1
- First registered: 2010.12.22
- Last authored: 2010.04
- 4 pages / Hancom Office (HWP)
- Price: 3,000 KRW
Introduction
- Line search program using the secant method (shared by all of the algorithms below)
- Steepest descent algorithm using the secant method for the line search
- Steepest descent algorithm using the secant method for the line search, applied to Rosenbrock's function
- Conjugate gradient algorithm for general functions, using the secant method for the line search, applied to Rosenbrock's function
Compile/run environment
- None
Main text
※ Line search program using the secant method, applied in common to all of the algorithms above
%linesearch_secant : line search by the secant method
% Finds the step size alpha that zeros phi'(a) = d'*Grad(x0 + a*d).
% Assumes x0 (current point), d (search direction), and Grad (gradient
% function of the objective) are already defined in the workspace.
init_s = 0;          % previous step, alpha_(k-1)
init_l = 0.00001;    % current step, alpha_k
Epsilon = 1e-6;      % relative stopping tolerance
f_s = d'*Grad( x0 + init_s * d );   % phi'(alpha_(k-1))
f_l = d'*Grad( x0 + init_l * d );   % phi'(alpha_k)
n = 1;
while 1
    % secant update toward the root of phi'
    new = init_l - (( init_l - init_s ) / ( f_l - f_s )) * f_l;
    f_s = f_l;
    init_s = init_l;
    init_l = new;
    f_l = d'*Grad( x0 + new * d );
    % stop once the directional derivative at the new step is small
    % relative to its magnitude at alpha = 0
    if abs(f_l) <= Epsilon * abs( d'*Grad(x0) )
        break
    end
    n = n + 1;
end
alpha = init_l;
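The listing above shows only the shared line search; the steepest descent and conjugate gradient drivers described in the introduction are not included. The following is a minimal sketch, in Python/NumPy rather than the document's MATLAB, of how such drivers could call a secant line search on Rosenbrock's function. All names here (rosenbrock_grad, linesearch_secant, steepest_descent, conjugate_gradient), the Fletcher-Reeves choice of beta, and the tolerance/iteration parameters are illustrative assumptions, not the original homework code.

```python
import numpy as np

def rosenbrock_grad(x):
    """Gradient of Rosenbrock's function f(x) = 100*(x2 - x1^2)^2 + (1 - x1)^2."""
    x1, x2 = x
    return np.array([-400.0 * x1 * (x2 - x1**2) - 2.0 * (1.0 - x1),
                     200.0 * (x2 - x1**2)])

def linesearch_secant(grad, x0, d, eps=1e-6, max_iter=100):
    """Secant-method line search: approximate a root of
    phi'(a) = d . grad(x0 + a*d), mirroring the MATLAB snippet above."""
    a_prev, a_curr = 0.0, 1e-5
    dphi_prev = d @ grad(x0 + a_prev * d)
    dphi_curr = d @ grad(x0 + a_curr * d)
    dphi_zero = abs(d @ grad(x0))
    for _ in range(max_iter):
        denom = dphi_curr - dphi_prev
        if denom == 0.0:  # secant step undefined; stop with current estimate
            break
        a_new = a_curr - (a_curr - a_prev) / denom * dphi_curr
        a_prev, dphi_prev = a_curr, dphi_curr
        a_curr = a_new
        dphi_curr = d @ grad(x0 + a_curr * d)
        if abs(dphi_curr) <= eps * dphi_zero:
            break
    return a_curr

def steepest_descent(grad, x0, tol=1e-6, max_iter=1000):
    """Steepest descent: move along -grad, step size from the secant search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g
        x = x + linesearch_secant(grad, x, d) * d
    return x

def conjugate_gradient(grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear conjugate gradient with the Fletcher-Reeves beta
    (an assumption; other beta formulas would work similarly)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x = x + linesearch_secant(grad, x, d) * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves formula
        d = -g_new + beta * d
        g = g_new
    return x
```

A call such as conjugate_gradient(rosenbrock_grad, np.array([-2.0, 2.0])) searches for the minimizer (1, 1); steepest descent on the same problem is known to converge far more slowly along the curved valley, which is presumably the point of the homework comparison.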
References
E.K.P. Chong and S.H. Żak, An Introduction to Optimization, Wiley, 2008.