MATH 340 
LAB 3 Assignment

Fixed Point Iteration
A general fixed point iteration is a recursive scheme such as:
$$x_{n+1} = g(x_n), \qquad n \ge 0.$$
This is not guaranteed to converge to a fixed point $r$, i.e. a point such that $r = g(r)$, unless certain
assumptions are satisfied.
You can visualize the iteration through a “cobweb” graph, drawing the
lines joining the pairs of points $(x_n, x_n)$ [a point on the bisector line $y = x$]
and $(x_n, g(x_n))$. See Figure 1 for an example showing just a few iterations and
the corresponding lines.
Figure 1: Example of a cobweb graph for the logistic function.
Assuming we don’t know the actual fixed point $r$, we can approximate the error
by $E_n = |x_{n+1} - x_n|$.
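As an illustration only, here is a minimal Python sketch of such an iteration that stops when $E_n$ falls below a chosen tolerance or after a maximum number of steps; the function name and default values are placeholders, not part of the assignment text:

```python
def fixed_point(g, x0, tol=1e-6, N=50):
    """Iterate x_{n+1} = g(x_n) until |x_{n+1} - x_n| < tol or N steps are reached."""
    x = x0
    for n in range(1, N + 1):
        x_new = g(x)
        if abs(x_new - x) < tol:          # E_n = |x_{n+1} - x_n|
            print(f"Converged after {n} iterations: x = {x_new}")
            return x_new
        x = x_new
    print(f"No convergence within {N} iterations; last iterate x = {x}")
    return x
```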
Problem 1:
Write your own program that computes the fixed point of a given function. Check
whether your program converges to a fixed point or diverges. The convergence
criterion for your program is given by the condition $|x_{n+1} - x_n| < \text{tol}$, where tol is
a given tolerance. In case it diverges, it will run up to a maximum number $N$ of
iterations. Print out a message in either case stating how many iterations your program
has run and what solution it has found. Use your program to solve the following:
(1.1) Do problem 17 of your homework by using your code. Use an initial guess
$x_0 = 1$, a tolerance of $10^{-6}$, and a maximum number of steps $N = 50$.
(1.2) Solve the fixed point iteration with $g(x_n) = 2 + (x_n - 2)^2$, up to a tolerance of $10^{-6}$
or a maximum number of iterations $N = 50$, with (a) $x_0 = 2.8$, (b) $x_0 = 3.1$.
Which execution is better, and why?
(1.3) Solve the fixed point iteration with $g(x_n) = 2 + (x_n - 2)^3$, up to a tolerance of $10^{-6}$
or a maximum number of iterations $N = 50$, with (a) $x_0 = 2.8$, (b) $x_0 = 3.1$.
Which execution is better, and why?
Extra Credit (+4):
For better visualization of the iterations, plot a cobweb graph in your program.
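One possible approach is sketched below with matplotlib: alternate a vertical segment from the bisector $y = x$ up to the curve $y = g(x)$ with a horizontal segment back to the bisector. The logistic map in the usage comment and its parameter are only assumptions echoing Figure 1, not values given in the assignment:

```python
import numpy as np
import matplotlib.pyplot as plt

def cobweb(g, x0, n_iter=20, xlim=(0.0, 1.0)):
    """Draw a cobweb diagram for the iteration x_{n+1} = g(x_n)."""
    xs = np.linspace(xlim[0], xlim[1], 400)
    plt.plot(xs, g(xs), label="y = g(x)")
    plt.plot(xs, xs, "k--", label="y = x")             # the bisector line

    x = x0
    for _ in range(n_iter):
        y = g(x)
        plt.plot([x, x], [x, y], "r", linewidth=0.8)   # vertical step to the curve
        plt.plot([x, y], [y, y], "r", linewidth=0.8)   # horizontal step back to y = x
        x = y

    plt.legend()
    plt.show()

# e.g. cobweb(lambda x: 3.2 * x * (1 - x), 0.3)   # a logistic map, as in Figure 1
```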
Root Finding: Newton’s Method
This method uses a recursive formula to find approximate solutions, called
iterates, of an equation of the type $f(x) = 0$:
$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}, \qquad n = 0, 1, 2, \ldots, \quad \text{for } f'(x_n) \neq 0 \qquad (1)$$
In this case too, we consider an estimate of the error to be the difference between
two consecutive iterates, $\mathrm{Err}_n = |x_{n+1} - x_n|$. This method is not guaranteed to
converge if the initial guess $x_0$ is not close enough to the root $r$ we want to find.
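For reference only, a minimal Python sketch of iteration (1) using the same stopping criterion $\mathrm{Err}_n < \text{tol}$; the function name, arguments, and the usage example are illustrative assumptions, not part of the assignment:

```python
def newton(f, df, x0, tol=1e-6, N=50):
    """Newton iteration x_{n+1} = x_n - f(x_n)/f'(x_n)."""
    x = x0
    for n in range(1, N + 1):
        dfx = df(x)
        if dfx == 0:
            print(f"Zero derivative at iteration {n}; stopping.")
            return x
        x_new = x - f(x) / dfx
        if abs(x_new - x) < tol:          # Err_n = |x_{n+1} - x_n|
            print(f"Converged after {n} iterations: x = {x_new}")
            return x_new
        x = x_new
    print(f"No convergence within {N} iterations; last iterate x = {x}")
    return x

# e.g. newton(lambda x: x**2 - 2, lambda x: 2 * x, 1.0)   # approximates sqrt(2)
```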
Problem 2
Implement your own Newton’s method to solve Problems 19 and 21 in your Homework, with $N = 50$. Compare your results and the performance of your code with
those obtained through the bisection method in Lab 2, Problems 1 and 3,
respectively.