1

From the following question, I understood the intuition behind how cutting the input size down to its square root on each iteration leads to $O(\log \log n)$ complexity.

I tried to derive it on paper.

Let T(n) = T($\sqrt{n}$) + c
$\implies$ T(n) = T($n^{1/4}$) + 2c
$\implies$ T(n) = T($n^{1/8}$) + 3c
$\implies$ T(n) = T($n^{1/16}$) + 4c
..... and so on, as long as the argument stays above some constant (say 2).

I noticed that the exponent of n halves at each step, i.e. after $k$ steps we have $T(n) = T(n^{1/2^k}) + kc$.

How should I proceed from here? I need to derive that $T(n)$ is $O(\log \log n)$.
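As a sanity check on the intuition (a sketch of my own, not part of the question), this snippet counts how many repeated square roots it takes to bring $n$ down to 2, and compares that count to $\log_2 \log_2 n$:

```python
import math

def sqrt_iterations(n):
    """Count how many times sqrt can be applied before n drops to <= 2."""
    count = 0
    while n > 2:
        n = math.sqrt(n)
        count += 1
    return count

# Powers of 2 keep the square roots exact in floating point.
for n in [2**4, 2**16, 2**256]:
    print(sqrt_iterations(n), math.log2(math.log2(n)))
```

The two columns agree exactly for these inputs, matching the claimed $\Theta(\log \log n)$ iteration count.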

rsonx

3 Answers

2

Let $x = \log n$ and $Q(x) = T(2^x)$. You can rewrite your recurrence as follows: $$ Q(x) = T(2^x) = T(n) = T(n^\frac{1}{2}) + c = T(2^{x/2}) + c = Q(x/2) + c. $$

This recurrence is easily solved using, e.g., the Master Theorem, obtaining $Q(x) = \Theta(\log x)$. Substituting back: $$ T(n) = Q(x) = \Theta(\log x) = \Theta(\log \log n). $$
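As a quick numerical illustration of this substitution (my own sketch, assuming unit cost $c = 1$ and base case $T(n) = 0$ for $n \leq 2$):

```python
import math

def T(n):
    """Original recurrence T(n) = T(sqrt(n)) + 1, base case n <= 2."""
    if n <= 2:
        return 0
    return T(math.sqrt(n)) + 1

def Q(x):
    """Substituted recurrence Q(x) = Q(x/2) + 1, where x = log2(n)."""
    if x <= 1:
        return 0
    return Q(x / 2) + 1

# The two recurrences agree once we set x = log2(n).
for n in [16, 2**16, 2**64]:
    assert T(n) == Q(math.log2(n))
```

The halving of $x$ in `Q` is exactly the square-rooting of $n$ in `T`, which is why the standard tools for divide-by-two recurrences apply.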

Steven
1

Define S(k) = $T(2^{2^k})$.

Then S(k) = $T(2^{2^k})$ = $T(2^{2^{k-1}}) + c$ = $T(2^{2^{k-2}}) + 2c$ = ... = $T(2^{2^{k-k}}) + k\cdot c$ = $T(2) + k\cdot c$.

Since $n = 2^{2^k}$ means $k = \log \log n$, this gives $T(n) = T(2) + c \log \log n = \Theta(\log \log n)$.

gnasher729
1

As you mention, you can show inductively that $T(n) = T(n^{1/2^k}) + kc$, with base case $T(n) = O(1)$ for $n \leq 2$ (say). It follows that $T(n) = \Theta(\ell)$, where $\ell$ is the minimal number such that $n^{1/2^\ell} \leq 2$. Taking a log, we get $\frac{\log n}{2^\ell} \leq 1$, or $\log n \leq 2^\ell$. Taking another log, we get $\log \log n \leq \ell$. Hence $\ell = \lceil \log\log n \rceil$, and it follows that $T(n) = \Theta(\log\log n)$.
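To illustrate the last step numerically (my own check, not from the answer): the minimal $\ell$ with $n^{1/2^\ell} \leq 2$, found by brute force, matches $\lceil \log \log n \rceil$:

```python
import math

def min_ell(n):
    """Smallest ell such that n**(1/2**ell) <= 2, by direct search."""
    ell = 0
    while n ** (1 / 2 ** ell) > 2:
        ell += 1
    return ell

# min_ell(n) equals ceil(log2(log2(n))) for these n > 2.
for n in [10, 100, 10**6, 2**40]:
    assert min_ell(n) == math.ceil(math.log2(math.log2(n)))
```

The inputs avoid exact boundary cases (where $\log \log n$ is an integer), so floating-point rounding does not affect the comparison.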

Yuval Filmus