Question:

In this exercise we briefly introduce the Master Theorem. (For more on this result, including a proof, we refer the reader to pp. 73-84 of reference [5] by T. H. Cormen, C. E. Leiserson, R. L. Rivest, and C. Stein.)
Consider the recurrence relation
f(1) = 1,
f(n) = a f(n/b) + h(n),
where n ∈ Z+, n > 1, a ∈ Z+, a < n, and b ∈ R+, 1 < b < n. The function h accounts for the time (or cost) of dividing the given problem of size n into a smaller (similar) problems of size approximately n/b and then combining the results from the a smaller problems. Further, there exists k ∈ Z+ such that h(n) > 0 for all n ≥ k. (Since n/b need not be an integer, the recurrence relation is not properly defined. To get around this we need to replace n/b by either ⌊n/b⌋ or ⌈n/b⌉. But as this does not affect the outcome of the result for large values of n, we shall not concern ourselves with such details.)
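To make the remark about rounding concrete, here is a minimal Python sketch (ours, not part of the text; the names f_floor and f_ceil are purely illustrative) that evaluates the recurrence with n/b rounded down and rounded up. For large n the two values stay within a constant factor of each other, which is why the choice of rounding does not affect the order of growth.

import math

def f_floor(n, a, b, h):
    """f(n) = a*f(floor(n/b)) + h(n), with f(1) = 1."""
    return 1 if n <= 1 else a * f_floor(math.floor(n / b), a, b, h) + h(n)

def f_ceil(n, a, b, h):
    """Same recurrence, but rounding n/b up instead of down.
    The min(n - 1, ...) guard prevents ceil(n/b) = n (possible when b is
    close to 1), which would otherwise recurse forever."""
    return 1 if n <= 1 else a * f_ceil(min(n - 1, math.ceil(n / b)), a, b, h) + h(n)

# Example: f(n) = 16 f(n/4) + n (worked example (1) below).
# The ratio of the two values stays bounded, so the rounding choice
# does not change the order of growth.
for n in (1000, 10_000, 100_000):
    lo = f_floor(n, 16, 4, lambda m: m)
    hi = f_ceil(n, 16, 4, lambda m: m)
    print(n, lo, hi, round(hi / lo, 4))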
Under the above hypothesis we find the following [where Θ (big theta) and Ω (big omega) are as given in Exercises 11-16 for Section 5.7]:
(i) If h ∈ O(n^(log_b a - ε)), for some fixed ε > 0, then f ∈ Θ(n^(log_b a));
(ii) If h ∈ Θ(n^(log_b a)), then f ∈ Θ(n^(log_b a) log_2 n); and
(iii) If h ∈ Ω(n^(log_b a + ε)), for some fixed ε > 0, and if a·h(n/b) ≤ c·h(n), for some fixed c, where 0 < c < 1, and for all sufficiently large n, then f ∈ Θ(h).
In all three cases, the function h is compared with n^(log_b a) and, roughly speaking, the Master Theorem then determines the complexity of the solution f(n) as the larger of the two functions in cases (i) and (iii), while in case (ii) we find the added factor log_2 n. However, it is important to realize that there are some recurrence relations of this type that do not fall under any of these three cases.
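For the common special case h(n) = Θ(n^d), the classification reduces to comparing d with log_b a, since the regularity condition in case (iii) is then automatic: a·h(n/b) = (a/b^d)·n^d, and a/b^d < 1 exactly when d > log_b a. The Python sketch below (ours, not from the text, and limited to such power-function h; the name master_case is illustrative) carries out that comparison.

import math

def master_case(a, b, d):
    """Classify f(n) = a f(n/b) + Theta(n^d) by the Master Theorem.

    Assumes h(n) = Theta(n^d) with d >= 0; for such h the regularity
    condition of case (iii) holds automatically, because
    a*h(n/b) = (a / b**d) * n**d and a / b**d < 1 exactly when
    d > log_b(a).
    """
    crit = math.log(a, b)  # the critical exponent log_b(a)
    if math.isclose(d, crit):
        return f"case (ii):  f in Theta(n^{crit:g} * log n)"
    if d < crit:
        return f"case (i):   f in Theta(n^{crit:g})"
    return f"case (iii): f in Theta(n^{d:g})"

# The first two worked examples below:
print(master_case(16, 4, 1))    # (1) -> case (i),  Theta(n^2)
print(master_case(1, 4/3, 0))   # (2) -> case (ii), Theta(log n)
# Example (3) has h(n) = n log n, which is not a pure power of n,
# so this simplified check does not apply to it.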
For now we consider the following, where f(1) = 1 for all three examples.
(1) f(n) = 16f(n/4) + n. Here a = 16, b = 4, n^(log_b a) = n^(log_4 16) = n^2, and h(n) = n. So h ∈ O(n^(log_4 16 - ε)) with ε = 1. Consequently, h falls under the hypothesis for case (i) and it follows that f ∈ Θ(n^2).
(2) f(n) = f(3n/4) + 5. Now we have a = 1, b = 4/3, n^(log_b a) = n^(log_{4/3} 1) = n^0 = 1, and h(n) = 5. Consequently, h ∈ Θ(n^(log_{4/3} 1)) and from case (ii) we learn that f ∈ Θ(n^(log_{4/3} 1) log_2 n) = Θ(log_2 n).
(3) f(n) = 7f(n/8) + n log_2 n. For this recurrence relation we have a = 7, b = 8, n^(log_b a) = n^(log_8 7) ≈ n^0.936, and h(n) = n log_2 n. So h ∈ Ω(n^(log_8 7 + ε)), where ε = 0.064 > 0. Further, for all sufficiently large n, a·h(n/b) = 7(n/8) log_2(n/8) = (7/8)n[log_2 n - log_2 8] ≤ (7/8)n log_2 n = c·h(n), for 0 < c = 7/8 < 1. Thus, h satisfies the hypotheses for case (iii) and we have f ∈ Θ(n log_2 n).
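As an informal check on the three worked examples (ours, not part of the text), the sketch below evaluates each recurrence directly, rounding n/b down, and prints the ratio of f(n) to the growth rate predicted by the Master Theorem; if the prediction is right, the ratios stay within constant bounds as n grows.

import math

def f(n, a, b, h):
    """f(n) = a*f(floor(n/b)) + h(n) with f(1) = 1; floor keeps the argument an integer."""
    return 1 if n <= 1 else a * f(math.floor(n / b), a, b, h) + h(n)

# (label, a, b, h, predicted growth rate g)
examples = [
    ("(1) 16f(n/4) + n,      Theta(n^2):    ", 16, 4,   lambda n: n,                lambda n: n ** 2),
    ("(2) f(3n/4) + 5,       Theta(log n):  ", 1,  4/3, lambda n: 5,                lambda n: math.log2(n)),
    ("(3) 7f(n/8) + n log n, Theta(n log n):", 7,  8,   lambda n: n * math.log2(n), lambda n: n * math.log2(n)),
]

# The ratios f(n)/g(n) need not converge to 1, only stay bounded.
for label, a, b, h, g in examples:
    ratios = [f(n, a, b, h) / g(n) for n in (2 ** 10, 2 ** 14, 2 ** 18)]
    print(label, ["%.2f" % r for r in ratios])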
Use the Master Theorem to determine the complexity of f in each of the following, where f(1) = 1:
(a) f(n) = 9f(n/3) + n
(b) f(n) = 2f(n/2) + 1
(c) f(n) = f(2n/3) + 1
(d) f(n) = 2f(n/3) + n
(e) f(n) = 4f(n/2) + n2