Question:

Construct a sequence of interpolating values y_n to f(1 + √10), where f(x) = (1 + x^2)^(−1) for −5 ≤ x ≤ 5, as follows: for each n = 1, 2, ..., 10, let h = 10/n and y_n = P_n(1 + √10), where P_n(x) is the interpolating polynomial for f(x) at the nodes x_0^(n), x_1^(n), ..., x_n^(n), with x_j^(n) = −5 + jh for each j = 0, 1, 2, ..., n. Does the sequence {y_n} appear to converge to f(1 + √10)?
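The computation described above can be sketched as follows. This is a minimal Python implementation, not taken from the textbook: it evaluates each P_n via the Lagrange form directly (the function name `lagrange_eval` is my own choice). Note that f here is the classic Runge function, and the nodes are equally spaced, so equioscillating error growth near the endpoints of [−5, 5] is the behavior to watch for, since the evaluation point 1 + √10 ≈ 4.16 lies close to the right endpoint.

```python
import math

def f(x):
    # The Runge function from the exercise.
    return 1.0 / (1.0 + x * x)

def lagrange_eval(xs, ys, t):
    # Evaluate the interpolating polynomial through (xs[j], ys[j]) at t,
    # using the Lagrange form: sum_j ys[j] * prod_{k != j} (t - x_k)/(x_j - x_k).
    total = 0.0
    for j, (xj, yj) in enumerate(zip(xs, ys)):
        term = yj
        for k, xk in enumerate(xs):
            if k != j:
                term *= (t - xk) / (xj - xk)
        total += term
    return total

t = 1.0 + math.sqrt(10.0)
print("f(1 + sqrt(10)) =", f(t))
for n in range(1, 11):
    h = 10.0 / n
    xs = [-5.0 + j * h for j in range(n + 1)]  # x_j^(n) = -5 + j*h
    ys = [f(x) for x in xs]
    yn = lagrange_eval(xs, ys, t)              # y_n = P_n(1 + sqrt(10))
    print(n, yn)
```

Comparing the printed y_n values against f(1 + √10) = 1/(12 + 2√10) ≈ 0.0546 answers the convergence question empirically; a barycentric form would be more stable for larger n, but the direct Lagrange form suffices for n ≤ 10.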

Related Book:

Numerical Analysis, 9th edition, Richard L. Burden and J. Douglas Faires. ISBN: 978-0538733519.
