Is it possible to solve the recurrence relation
T(n) = √n T(√n) + n
using the Master Theorem? It is not of the form
T(n) = a ⋅ T(n / b) + f(n)
but this problem is given as an exercise in CLRS Chapter 4.
This cannot be solved by the Master Theorem. However, it can be solved with the recursion tree method, which gives O(n log log n).
The intuition behind this is to notice that at each level of the tree, you're doing n work. The top level does n work explicitly. Each of the √n subproblems does √n work for a net total of n work, etc. So the question now is how deep the recursion tree is. Well, that is the number of times that you can take the square root of n before n gets sufficiently small (say, less than 2). If we write
n = 2^(lg n)
then on each recursive call n will have its square root taken. This is equivalent to halving the above exponent, so after k iterations we have that
n^(1/2^k) = 2^((lg n)/2^k)
We want to stop once this drops to 2, giving
2^((lg n)/2^k) = 2
(lg n)/2^k = 1
lg n = 2^k
lg lg n = k
So the recursion stops after lg lg n rounds of taking square roots. And since at each level the recursion does O(n) work, the total runtime is O(n lg lg n).
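If you want to sanity-check this numerically, here is a small Python sketch (my own illustration, not from CLRS). It counts how many times n can be square-rooted before dropping to 2, and evaluates the recurrence directly on reals, using T(n) = n once n ≤ 2 as an arbitrary base case (the exact base case doesn't affect the asymptotics):

    import math

    def sqrt_depth(n):
        # Number of times we can take the square root before n drops to 2 or below.
        depth = 0
        while n > 2:
            n = math.sqrt(n)
            depth += 1
        return depth

    def T(n):
        # Evaluate T(n) = sqrt(n) * T(sqrt(n)) + n directly on reals,
        # with T(n) = n once n <= 2 (an arbitrary base case for illustration).
        if n <= 2:
            return n
        return math.sqrt(n) * T(math.sqrt(n)) + n

    for e in [8, 16, 32, 64]:
        n = 2.0 ** e
        lg_lg_n = math.log2(math.log2(n))
        print(f"n = 2^{e}: depth = {sqrt_depth(n)}, lg lg n = {lg_lg_n:.0f}, "
              f"T(n) / (n lg lg n) = {T(n) / (n * lg_lg_n):.2f}")

For these inputs the measured depth matches lg lg n exactly, and the ratio T(n) / (n lg lg n) stays bounded, consistent with the O(n lg lg n) bound.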
More generally, just as any algorithm that repeatedly cuts its input size in half should get you thinking "log n," any algorithm that repeatedly cuts its input size down by taking a square root should get you thinking "log log n." van Emde Boas trees use this recurrence, for example.
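As a quick illustration of that rule of thumb, here is another small sketch (again just for illustration) comparing how many steps halving takes versus square-rooting to shrink the same n:

    import math

    def steps(n, shrink, stop):
        # Count how many applications of `shrink` it takes to bring n down to `stop`.
        count = 0
        while n > stop:
            n = shrink(n)
            count += 1
        return count

    n = 2.0 ** 64
    print("halving:        ", steps(n, lambda x: x / 2, 1))  # about lg n = 64 steps
    print("square-rooting: ", steps(n, math.sqrt, 2))        # about lg lg n = 6 steps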
Interestingly, this recurrence is used to obtain the runtime of a famous algorithm for solving the closest pair of points problem deterministically, assuming that the computer can take the floor of an arbitrary real number in constant time.