Python Performance of the Steepest Descent Method Questions
Description
The Rosenbrock function is defined as:
f(x) = 100(x2 - x1^2)^2 + (1 - x1)^2
The minimizer of this function is the point x* = [1, 1]^t.
1. Test (in Python) the performance of the gradient descent algorithm
for the Rosenbrock function starting at the point [-1.2, 1]^t by finding the number
of iterations until convergence to a gradient norm of 10^-5.
2. Assess the performance of the steepest descent method on the above problem by
generating a semilog plot of the error f(x^k) - f(x*) vs. iteration count. Comment
on the convergence rate.
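Both questions can be answered with a short Python sketch. The backtracking parameters c = 0.01 and gamma = 0.5 are taken from the attached MATLAB script; the tolerance 10^-5 comes from the question, and maxit is an assumed safety cap.

```python
import numpy as np

def rosen(x):
    """2-D Rosenbrock function: f(x) = 100*(x2 - x1^2)^2 + (1 - x1)^2."""
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosen_grad(x):
    """Analytic gradient of the 2-D Rosenbrock function."""
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def steepest_descent(x0, tol=1e-5, maxit=50000, c=0.01, gamma=0.5):
    """Steepest descent with Armijo backtracking.

    Returns (final iterate, iterations used, f value at each iterate).
    c and gamma follow the attached MATLAB script; maxit is a safety cap."""
    x = np.asarray(x0, dtype=float)
    fvals = [rosen(x)]
    for k in range(maxit):
        g = rosen_grad(x)
        if np.linalg.norm(g) <= tol:        # gradient-norm stopping test
            return x, k, fvals
        d = -g                              # steepest-descent direction
        t, fx = 1.0, rosen(x)
        # shrink t until the sufficient-decrease (Armijo) condition holds
        while rosen(x + t * d) > fx + c * t * (g @ d):
            t *= gamma
        x = x + t * d
        fvals.append(rosen(x))
    return x, maxit, fvals

if __name__ == "__main__":
    import matplotlib
    matplotlib.use("Agg")                   # save to file; drop for a window
    import matplotlib.pyplot as plt

    x, iters, fvals = steepest_descent([-1.2, 1.0])
    print("iterations:", iters, " final x:", x)
    # f(x*) = 0 at x* = [1, 1], so the error f(x^k) - f(x*) is just f(x^k)
    plt.semilogy(fvals)
    plt.xlabel("iteration k")
    plt.ylabel("f(x^k) - f(x*)")
    plt.savefig("sd_convergence.png")
```

On the semilog plot the error tail is roughly a straight line, which indicates linear (geometric) convergence; this is the expected behavior of steepest descent on the Rosenbrock function, whose narrow curved valley makes that linear rate very slow.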
Attachment Preview (MATLAB reference code)
function [f,Df]=rosen(x)
% the 2D Rosenbrock function and its gradient
[m,n]=size(x);
if (m ~= 2 | n ~= 1)
  error('Bad data sent to 2D Rosenbrock function')
end
z  = [x(1)^2 - x(2), x(1)-1]';
f  = 100*z(1)^2 + z(2)^2;
Df = [400*x(1)*z(1) + 2*z(2), -200*z(1)]';
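Before porting rosen.m to Python, the analytic gradient can be sanity-checked against a central finite difference. This is a sketch assuming NumPy; the step h and the test point are choices, not part of the assignment.

```python
import numpy as np

def rosen(x):
    # f and its gradient, matching rosen.m above
    z = np.array([x[0]**2 - x[1], x[0] - 1.0])
    f = 100.0 * z[0]**2 + z[1]**2
    Df = np.array([400.0 * x[0] * z[0] + 2.0 * z[1], -200.0 * z[0]])
    return f, Df

def fd_grad(x, h=1e-6):
    # central finite-difference approximation of the gradient
    g = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (rosen(x + e)[0] - rosen(x - e)[0]) / (2.0 * h)
    return g

x0 = np.array([-1.2, 1.0])
print(np.abs(rosen(x0)[1] - fd_grad(x0)).max())   # should be tiny
```

Agreement to several digits gives confidence that the MATLAB gradient formula was transcribed correctly.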
Surface Plot of the Rosenbrock function
[X,Y]=meshgrid(-2:0.1:2);
Z=100*(Y-X.^2).^2+(ones(size(X))-X).^2;
surf(X,Y,Z)
Contour Plot of the Rosenbrock function
[X,Y]=meshgrid(-2:0.1:2);
Z=100*(Y-X.^2).^2+(ones(size(X))-X).^2;
contour(X,Y,Z)
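The same surface and contour plots can be reproduced in Python with Matplotlib. The 41-point linspace matches MATLAB's -2:0.1:2 grid; the output file name and the log-spaced contour levels (which reveal the banana-shaped valley) are choices.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")   # non-interactive backend; drop to show windows
import matplotlib.pyplot as plt

# Grid matching the MATLAB meshgrid(-2:0.1:2)
X, Y = np.meshgrid(np.linspace(-2, 2, 41), np.linspace(-2, 2, 41))
Z = 100.0 * (Y - X**2)**2 + (1.0 - X)**2

fig = plt.figure(figsize=(10, 4))
ax1 = fig.add_subplot(1, 2, 1, projection="3d")
ax1.plot_surface(X, Y, Z)                              # surf(X,Y,Z)
ax2 = fig.add_subplot(1, 2, 2)
ax2.contour(X, Y, Z, levels=np.logspace(-1, 3, 15))    # contour(X,Y,Z)
fig.savefig("rosenbrock_plots.png")
```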
Minimize Rosenbrock by Steepest Descent
minRosenBySD.m
%In this script we apply steepest descent with the
%backtracking linesearch to minimize the 2-D
%Rosenbrock function starting at the point x=(-1.9,2).
%Termination parameters
eps   = 1.0e-4;
epsf  = 1.0e-6;
maxit = 10000;
iter  = 0;
%Linesearch parameters for backtracking
gamma = 0.5;
c     = 0.01;
%Initialization
xc    = [-1.9;2];
fnc   = 'rosen';
[fc,Df] = feval(fnc,xc);
nDf   = norm(Df);
ndiff = 1;
data  = [iter,nDf,ndiff,0,fc];
%Are we already at a solution, or should we continue?
while nDf > epsf & ndiff > eps & iter < maxit,
  d     = -Df/nDf;
  DDfnc = Df'*d;
  [xn,fn,fcall] = backtrack(xc,d,fc,fnc,DDfnc,c,gamma,eps);
  ndiff = norm(xn-xc);
  xc    = xn;
  [fc,Df] = feval(fnc,xc);
  nDf   = norm(Df);
  iter  = iter + 1;
  data  = [data;[iter,nDf,ndiff,fcall,fc]];
end
%Report reason for termination.
if nDf <= epsf
  disp('Converged: gradient norm below epsf.')
elseif ndiff <= eps
  disp('Stopped: step length below eps.')
else
  disp('Stopped: maximum iterations reached.')
end