Solving the recurrence relation T(n) = √n T(√n) + n [closed]
Is it possible to solve the recurrence relation

T(n) = √n T(√n) + n

using the Master Theorem? It is not of the form

T(n) = a ⋅ T(n / b) + f(n)

but this problem is given in the exercises of CLRS Chapter 4.

Caryloncaryn answered 2/9, 2011 at 7:11 Comment(1)
A similar question: math.stackexchange.com/questions/689887/… – Minivet

This cannot be solved by the Master Theorem. However, it can be solved with the recursion-tree method, which resolves it to O(n log log n).

The intuition behind this is to notice that at each level of the tree, you're doing n work. The top level does n work explicitly. Each of the √n subproblems does √n work for a net total of n work, etc. So the question now is how deep the recursion tree is. Well, that is the number of times that you can take the square root of n before n gets sufficiently small (say, less than 2). If we write
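To see how slowly that depth grows, here is a short Python sketch (the function name and the threshold of 2 are my own choices, not from the question) that counts how many square roots it takes to bring n below 2 and compares the count to lg lg n:

```python
import math

def sqrt_depth(n):
    """Number of times we can take sqrt(n) before the value drops below 2."""
    depth = 0
    while n >= 2:
        n = math.sqrt(n)
        depth += 1
    return depth

# The depth tracks lg lg n: for n = 2^(2^k) it is exactly k + 1,
# since the final square root is the one that takes 2 below 2.
for n in [16, 65536, 2**64]:
    print(n, sqrt_depth(n), math.log2(math.log2(n)))
```

Even for n = 2^64 the depth is only 7, which is the "log log n" behavior described above.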

n = 2^(lg n)

then on each recursive call n will have its square root taken. This is equivalent to halving the above exponent, so after k iterations we have that

n^(1/2^k) = 2^((lg n)/2^k)

We want to stop when this is less than 2, giving

2^((lg n)/2^k) = 2

(lg n)/2^k = 1

lg n = 2^k

lg lg n = k

So after lg lg n iterations of square rooting the recursion stops. And, since at each level the recursion does O(n) work, the total runtime is O(n lg lg n).
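As a sanity check, the recurrence can be evaluated numerically (a sketch; the base case T(n) = 1 for n < 2 is an assumption of mine, not from the question) and the ratio T(n)/(n lg lg n) watched as n grows:

```python
import math

def T(n):
    """Directly evaluate T(n) = sqrt(n) * T(sqrt(n)) + n, with T(n) = 1 for n < 2."""
    if n < 2:
        return 1.0
    r = math.sqrt(n)
    return r * T(r) + n

# The ratio T(n) / (n * lg lg n) stays bounded as n grows,
# consistent with the O(n lg lg n) bound derived above.
for n in [2**8, 2**16, 2**32, 2**64]:
    print(n, T(n) / (n * math.log2(math.log2(n))))
```

The recursion depth here is only lg lg n, so even n = 2^64 evaluates in a handful of calls.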

More generally, just as any algorithm that repeatedly cuts its input size in half should get you thinking "log n," any algorithm that repeatedly cuts its input size down by taking a square root should get you thinking "log log n"; van Emde Boas trees use this recurrence, for example.

Interestingly, this recurrence is used to obtain the runtime of a famous algorithm for solving the closest pair of points problem deterministically assuming that the computer can take the floor of an arbitrary real number in constant time.

Sharika answered 2/9, 2011 at 7:32 Comment(5)
Thanks!! I did try solving it by recursion tree but totally missed the point that at each level the cost will be constant = n!! :) – Caryloncaryn
Actually, you may be able to use the Master Theorem if you rewrite n = 2^(2^k), in which case T(n) = √n T(√n) + n becomes: T(2^(2^k)) = 2^(2^(k-1)) T(2^(2^(k-1))) + 2^(2^k). However, the 'a' and 'b' are not constant here. I think there should be a way, but I can't think of one right now. – Keelung
Thanks for the great explanation. When deriving log log n, I thought it was less confusing to start with n^(1/2^k) = 2, then raise both sides to the power 2^k, giving n = 2^(2^k), which then becomes log log n = k as above. – Antimony
What you said is for T(√n), but there is a √n multiplied with T(√n). Won't that be considered? – Expulsion
T(n) = 4*T(sqrt(n)) + n: can we use the above procedure for this problem also? Link for the question: #13458899 – Expulsion
