Master Theorem for Solving Recurrences
Recurrences occur in a divide and conquer strategy of solving complex problems.
What does it solve?
- It solves recurrences of the form T(n) = aT(n/b) + f(n).
- a must be greater than or equal to 1. This means the problem is reduced to at least one smaller subproblem, i.e. at least one recursion is needed.
- b must be greater than 1, so that at every recursion the size of the problem shrinks by a factor of b. If b were not greater than 1, the subproblems would not be smaller than the original.
- f(n) must be positive for sufficiently large values of n.
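As a quick illustration, such a recurrence can be evaluated numerically once a, b, and f are fixed. The helper below is a hypothetical sketch (the name T and the base case T(1) = 1 are assumptions, not part of the theorem's statement), shown here with the merge-sort recurrence T(n) = 2T(n/2) + n:

```python
# Hypothetical helper: evaluate T(n) = a*T(n/b) + f(n) with assumed base case T(1) = 1.
def T(n, a, b, f):
    if n <= 1:
        return 1
    return a * T(n // b, a, b, f) + f(n)

# Merge sort: a = 2, b = 2, f(n) = n. For powers of two this equals n*log2(n) + n.
print(T(1024, 2, 2, lambda n: n))  # 11264 == 1024*10 + 1024
```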
Consider the recursion tree for such a recurrence. We have a problem of size n to be solved. At each step, the problem is divided into a subproblems, and each subproblem is of a smaller size, where the size is reduced by a factor of b.

In simple words, a problem of size n is divided into a subproblems of relatively smaller size n/b, and each of those is divided the same way in turn.

The recursion tree also shows that, after the problem has been divided enough times, each subproblem at the bottom is so small that it can be solved in constant time.
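For concreteness, here is a minimal sketch of how the subproblem size shrinks level by level, assuming n = 64 and b = 2 (both values chosen arbitrarily for illustration):

```python
# Subproblem size at each level when the size is divided by b at every step.
n, b = 64, 2
sizes = []
size = n
while size >= 1:
    sizes.append(size)
    size //= b
print(sizes)  # [64, 32, 16, 8, 4, 2, 1] -- constant-size problems at the bottom
```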
For the derivation below, take all logarithms to base b.

Let us say that H is the height of the tree; then H = log_b(n). The number of leaves = a^(log_b n) = n^(log_b a).
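The identity a^(log_b n) = n^(log_b a) used for the leaf count can be checked numerically. A small sketch, assuming a = 4, b = 2, n = 1024 (arbitrary choices):

```python
import math

a, b, n = 4, 2, 1024
H = math.log(n, b)            # height of the tree, log_b(n) = 10
leaves = a ** H               # a^(log_b n)
alt = n ** math.log(a, b)     # n^(log_b a)
print(H, leaves, alt)         # both leaf counts agree (up to floating-point rounding)
```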
- Total work done at Level 1: f(n)
- Total work done at Level 2: a * f(n/b)
- Total work done at Level 3: a^2 * f(n/b^2)
- Total work done at the last level: number of leaves * θ(1), which is equal to θ(n^(log_b a))
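The per-level totals above can be tabulated for a concrete recurrence. A sketch assuming a = b = 2 and f(n) = n (the merge-sort case), where every level happens to do exactly n work:

```python
import math

# Level i has a^i subproblems of size n/b^i, so it does a^i * f(n/b^i) total work.
a, b, n = 2, 2, 16
f = lambda m: m
H = round(math.log(n, b))                           # height = 4
work = [a**i * f(n // b**i) for i in range(H + 1)]
print(work)  # [16, 16, 16, 16, 16] -- every level does n work
```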
The three cases of the Master Theorem
Case 1:
Assume that the cost of the operation increases by a significant factor at each level, so that by the time we reach the leaf level the value of f(n) is polynomially smaller than n^(log_b a). Then the overall running time is heavily dominated by the cost of the last level. Hence T(n) = θ(n^(log_b a)).
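A hedged sketch of Case 1: in T(n) = 4T(n/2) + n (a = 4, b = 2), the leaf term n^(log_2 4) = n^2 dominates f(n) = n, so doubling n should roughly quadruple T(n). The base case T(1) = 1 is an assumption for illustration:

```python
# Case 1 example: T(n) = 4*T(n/2) + n is θ(n^2) since f(n) = n is
# polynomially smaller than n^(log_2 4) = n^2.
def T(n):
    return 1 if n <= 1 else 4 * T(n // 2) + n

print(T(2048) / T(1024))  # close to 4, consistent with quadratic growth
```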
Case 2:
Assume that the cost of the operation on each level is roughly equal; that is, f(n) = θ(n^(log_b a) * log^k(n)) for some k >= 0. Then the total running time is f(n) times the total number of levels, where log_b(n) is the height of the tree. Hence

T(n) = θ(n^(log_b a) * log^(k+1)(n))

Note: here k+1 is the exponent of the log factor in the result, not its base.
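For Case 2 with k = 0, merge sort's recurrence T(n) = 2T(n/2) + n gives θ(n log n). A sketch assuming the base case T(1) = 1, for which the exact solution at powers of two is n*log2(n) + n:

```python
import math

# Case 2 example: f(n) = n = θ(n^(log_2 2)), so k = 0 and T(n) = θ(n log n).
def T(n):
    return 1 if n <= 1 else 2 * T(n // 2) + n

n = 1 << 12                                # 4096
print(T(n), n * round(math.log2(n)) + n)   # both equal 53248
```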
Case 3:
Assume that the cost of the operation decreases by a significant factor at each level, so that by the time we reach the leaf level the value of f(n) is polynomially larger than n^(log_b a). Then the overall running time is heavily dominated by the cost of the first level (provided f also satisfies the regularity condition a * f(n/b) <= c * f(n) for some constant c < 1). Hence T(n) = θ(f(n)).
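A hedged sketch of Case 3: in T(n) = 2T(n/2) + n^2, f(n) = n^2 is polynomially larger than n^(log_2 2) = n, so the root level dominates and T(n) = θ(n^2). Again, T(1) = 1 is an assumed base case:

```python
# Case 3 example: T(n) = 2*T(n/2) + n^2 is θ(n^2); the ratio T(n)/n^2
# settles near a constant as n grows (here it is exactly 2 - 1/n).
def T(n):
    return 1 if n <= 1 else 2 * T(n // 2) + n * n

for n in (256, 1024, 4096):
    print(n, T(n) / n**2)  # approaches 2
```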
If you are interested in more detailed reading and examples to practice, visit my blog entry Master Method to Solve Recurrences