Question: 1.9. Let f(t) = 1 - t and f_k(t) = 1 - t + (-1)^k i k^{-1}, k ≥ 1. Then f_k never vanishes and converges uniformly to f in [0, 2]. Let √f_k denote the distinguished square root of f_k in [0, 2]. Show that √f_k does not converge in any neighborhood of t = 1. Why is Theorem 7.6.3 not applicable? [This example is supplied by E. Reich.]
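Below is a minimal numerical sketch of the oscillation the exercise asks about, not the textbook's solution. It assumes the garbled term in the statement is (-1)^k i k^{-1}, so that f_k(t) = 1 - t + (-1)^k i/k; the function name f_k and the sample point t = 1.5 are illustrative choices. Since Im f_k(t) = (-1)^k/k never vanishes, f_k stays off the negative real axis, so the principal branch of the square root is continuous on [0, 2] and agrees with the positive root near t = 0; it is therefore the distinguished square root.

```python
import cmath

# Sketch under the assumption f_k(t) = 1 - t + (-1)^k * i / k (the exact
# term is garbled in the scanned problem statement). Because the imaginary
# part (-1)^k / k is never zero, f_k(t) never meets the negative real axis,
# so the principal square root (cmath.sqrt) is continuous on [0, 2] and
# plays the role of the distinguished square root.

def f_k(t, k):
    return 1.0 - t + (-1) ** k * 1j / k

t = 1.5  # any fixed t > 1 in a neighborhood of t = 1 behaves the same way
for k in (10, 11, 100, 101, 1000, 1001):
    root = cmath.sqrt(f_k(t, k))
    print(f"k = {k:5d}: sqrt(f_k({t})) = {root:.4f}")

# The printed values approach +i*sqrt(0.5) along even k and -i*sqrt(0.5)
# along odd k: the two subsequences have different limits, so sqrt(f_k(t))
# does not converge at any t > 1.
```

For t slightly greater than 1 the values of f_k(t) sit just above the negative real axis for even k and just below it for odd k, which is why the square roots split toward +i√(t-1) and -i√(t-1) respectively.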
