Question:
Assume that f has at least two continuous derivatives on an interval containing a with f'(a) = 0. Use Taylor’s Theorem to prove the following version of the Second Derivative Test.
a. If f''(x) > 0 on some interval containing a, then f has a local minimum at a.
b. If f''(x) < 0 on some interval containing a, then f has a local maximum at a.
Step by Step Answer:
Since f has two continuous derivatives on an interval containing a, Taylor's Theorem with the Lagrange form of the remainder gives, for every x in that interval,

f(x) = f(a) + f'(a)(x − a) + (f''(c)/2)(x − a)²

for some c between a and x. Because f'(a) = 0, this simplifies to

f(x) − f(a) = (f''(c)/2)(x − a)².

a. Suppose f''(x) > 0 on some interval containing a. For any x in that interval, the point c lies between a and x, so f''(c) > 0. Since (x − a)² ≥ 0, the identity above gives f(x) − f(a) ≥ 0, i.e. f(x) ≥ f(a) for all x near a. Therefore f has a local minimum at a.

b. Suppose f''(x) < 0 on some interval containing a. By the same identity, f''(c) < 0 and (x − a)² ≥ 0 give f(x) − f(a) ≤ 0, i.e. f(x) ≤ f(a) for all x near a. Therefore f has a local maximum at a.
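As a quick sanity check (an illustration, not part of the proof), the conclusion of part (a) can be verified numerically on a sample function of our choosing: f(x) = (x − 1)², which satisfies f'(1) = 0 and f''(x) = 2 > 0 everywhere, so the test predicts a local minimum at a = 1.

```python
def f(x):
    # Sample function with f'(1) = 0 and f''(x) = 2 > 0 everywhere.
    return (x - 1) ** 2

a = 1.0

# Sample points on both sides of a inside a small interval and
# confirm that f(x) >= f(a), as the Second Derivative Test predicts.
offsets = (-0.1, -0.01, 0.01, 0.1)
assert all(f(a + d) >= f(a) for d in offsets)
print("local minimum confirmed at a =", a)
```

The same check with f(x) = −(x − 1)² and the inequality reversed illustrates part (b).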
Related Book: Calculus Early Transcendentals, 2nd edition, by William L. Briggs, Lyle Cochran, and Bernard Gillett (ISBN 978-0321947345).