
Question:

Let g ∈ C¹[a, b] and let p ∈ (a, b) be a fixed point of g, i.e., g(p) = p, with |g′(p)| > 1. Show that there exists a δ > 0 such that if 0 < |p0 − p| < δ, then |p0 − p| < |p1 − p|, where p1 = g(p0). Thus, no matter how close the initial approximation p0 is to p, the next iterate p1 is farther away, so the fixed-point iteration does not converge when p0 ≠ p.
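As a quick numerical illustration (not part of the textbook exercise), take g(x) = x², which has a fixed point p = 1 with g′(1) = 2 > 1; one step of fixed-point iteration from any nearby p0 lands farther from p:

```python
# Illustrative example (assumed, not from the book): g(x) = x**2,
# fixed point p = 1, g'(1) = 2 > 1, so iterates move away from p.
def g(x):
    return x * x

p = 1.0
p0 = 1.001          # initial approximation close to p
p1 = g(p0)          # first fixed-point iterate

# The first iterate is farther from the fixed point than p0 was:
assert abs(p1 - p) > abs(p0 - p)
print(abs(p0 - p), abs(p1 - p))
```

Repeating the step only amplifies the error, consistent with the divergence the exercise asks you to prove.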

Step by Step Answer:
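A standard argument (a sketch, using the continuity of g′ and the Mean Value Theorem):

```latex
Since $|g'(p)| > 1$ and $g'$ is continuous on $[a,b]$, there exist $k > 1$
and $\delta > 0$ such that $|g'(x)| \ge k$ whenever $|x - p| \le \delta$.
If $0 < |p_0 - p| < \delta$, the Mean Value Theorem gives a $\xi$ between
$p_0$ and $p$ with
\[
  |p_1 - p| = |g(p_0) - g(p)| = |g'(\xi)|\,|p_0 - p| \ge k\,|p_0 - p| > |p_0 - p| .
\]
Hence every iterate starting from $p_0 \ne p$ inside the interval moves
farther from $p$, and the iteration cannot converge to $p$.
```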

Related Book: Numerical Analysis, 9th edition, by Richard L. Burden and J. Douglas Faires. ISBN: 978-0538733519.
