8

I am stuck analyzing the time complexity of the following algorithm:

def fun(r, k, d, p):
    if d > p:
        return r
    if d == 0 and p == 0:
        r = r + k
        return r
    if d > 0:
        fun(r, k + 1, d - 1, p)
    if p > 0:
        fun(r, k - 1, d, p - 1)

The root call is fun(0, 0, n, n), where n is the size of the problem.

My guess: the recurrence relation is $T(n, n) = T(n-1, n) + T(n, n-1)$, which is equivalent to $T(2n) = 2T(2n-1)$, i.e. $T(m) = 2T(m-1)$, and so $O(2^m) = O(4^n)$.

Is my analysis correct (I know it's not very complete or exact)? If it has a serious flaw, please point it out, or show me a correct and complete proof of the time complexity of this algorithm.
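One way to sanity-check a guess like this is to count the actual number of calls for small $n$. Here is a sketch; `calls` is a helper introduced here (not part of the algorithm) that mirrors the branching of `fun`, memoized on $(d, p)$, the only arguments that affect the call structure:

```python
from functools import lru_cache

# Total number of calls made by fun(0, 0, n, n), counting the root call
# and the immediately-returning calls with d > p.
@lru_cache(maxsize=None)
def calls(d, p):
    if d > p:
        return 1
    if d == 0 and p == 0:
        return 1
    total = 1
    if d > 0:
        total += calls(d - 1, p)
    if p > 0:
        total += calls(d, p - 1)
    return total

print([calls(n, n) for n in range(6)])  # → [1, 4, 10, 26, 73, 219]
```

The ratios 219/73 ≈ 3.0 and so on creep toward 4 only slowly, which already hints that the true growth rate is $4^n$ divided by a polynomial factor rather than a clean $4^n$.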

WIZARDELF

2 Answers

10

The only two arguments relevant to the asymptotic analysis are $d$ and $p$. These arguments (virtually) satisfy $d,p \geq 0$ and $d \leq p$ (we need to shuffle the logic in the function slightly to get this). At each point in the execution, you take the current pair $(d,p)$ and recursively call the function on the pairs $(d-1,p)$ and $(d,p-1)$, avoiding pairs that violate the constraints stated above.

We can picture the resulting call tree as a path starting at $(0,0)$. Each time you decrease $p$, add a / step. Each time you decrease $d$, add a \ step. The condition $d \leq p$ guarantees that you never go below the X axis. Moreover, you have a "budget" of $n$ of each step. The total number of leaves in this call tree is exactly the Catalan number $\binom{2n}{n}/(n+1) = \Theta(4^n/n^{3/2})$, and this gives us a lower bound on the running time of the function.
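This leaf count can be checked numerically. The sketch below (with helper names `leaves` and `catalan` introduced here) counts the call paths that reach the $(0,0)$ base case; a path dies as soon as $d > p$, so the survivors are exactly the lattice paths staying in the region $d \leq p$:

```python
from functools import lru_cache
from math import comb

# Number of call paths from (d, p) that reach the (0, 0) base case.
@lru_cache(maxsize=None)
def leaves(d, p):
    if d > p:
        return 0  # this branch returns without ever reaching (0, 0)
    if d == 0 and p == 0:
        return 1
    total = 0
    if d > 0:
        total += leaves(d - 1, p)
    if p > 0:
        total += leaves(d, p - 1)
    return total

def catalan(n):
    return comb(2 * n, n) // (n + 1)

print([leaves(n, n) for n in range(7)])   # → [1, 1, 2, 5, 14, 42, 132]
print([catalan(n) for n in range(7)])     # → [1, 1, 2, 5, 14, 42, 132]
```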

To get an upper bound, note that the path to each leaf passes through at most $2n$ nodes, so the total number of nodes is at most a factor of $2n$ larger than the number of leaves, i.e., an upper bound of $O(4^n/\sqrt{n})$.

We have a lower bound of $\Omega(4^n/n^{3/2})$ and an upper bound of $O(4^n/\sqrt{n})$. What are the exact asymptotics? They grow like the total number of paths not crossing the X axis that have at most $n$ steps in each direction. Using Bertrand's ballot theorem we can get an exact expression for this: $$ \sum_{0 \leq d \leq p \leq n} \frac{p-d+1}{p+1} \binom{p+d}{p}. $$ It thus remains to estimate this sum asymptotically: $$ \sum_{0 \leq d \leq p \leq n} \binom{p+d}{p} - \sum_{0 \leq d \leq p \leq n} \frac{d}{p+1} \binom{p+d}{d} = \\ \sum_{0 \leq d \leq p \leq n} \binom{p+d}{p} - \sum_{0 \leq d \leq p \leq n} \binom{p+d}{p+1} = \\ \sum_{p=0}^n \binom{2p+1}{p+1} - \sum_{p=0}^n \binom{2p+1}{p+2} = \\ \sum_{p=0}^n \frac{1}{p+1} \binom{2p+2}{p} = \Theta\left(\sum_{p=0}^n \frac{4^p}{p^{3/2}}\right) = \Theta\left(\frac{4^n}{n^{3/2}}\right). $$
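The first and last exact expressions in this chain can be compared numerically as a sanity check. A sketch using exact rational arithmetic (`ballot_sum` and `closed_form` are names introduced here):

```python
from fractions import Fraction
from math import comb

def ballot_sum(n):
    # sum over 0 <= d <= p <= n of (p - d + 1)/(p + 1) * C(p + d, p)
    return sum(Fraction(p - d + 1, p + 1) * comb(p + d, p)
               for p in range(n + 1) for d in range(p + 1))

def closed_form(n):
    # sum over 0 <= p <= n of C(2p + 2, p) / (p + 1)
    return sum(Fraction(comb(2 * p + 2, p), p + 1) for p in range(n + 1))

for n in range(10):
    assert ballot_sum(n) == closed_form(n)
```

Both sides come out to the same integers (1, 3, 8, … as $n$ grows), confirming the binomial manipulations above.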

Yuval Filmus
0

Case by case:

  1. d > p: constant time.
  2. d = 0 ∧ p = 0: constant time.
  3. d > 0: note that d ≯ p, so we have 0 < d ≤ p, and fun recurses on d - 1 until d ≯ 0; since p > 0, this is linear in d + (case 4).
  4. p > 0: note that d ≯ 0, so we have d ≤ 0 ≤ p (with d < p), and fun recurses on p - 1 until p ≯ 0; this is linear in p + (one of case 1, 2, or 5).
  5. d ≤ p < 0: undefined; I'm assuming this is constant time.

Starting with d = p = n > 0 hits case 3, which is followed by case 4. If n is a whole number, the final case is 2; otherwise it is 5. The total time for those cases is d + p + 1, or 2n + 1.
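The chain of cases described above can be traced by following only the first recursive call taken at each step (a sketch; `case_chain` is a name introduced here, and note that it follows a single branch of the call tree):

```python
def case_chain(n):
    """Follow only the first recursive call taken at each step,
    collecting the (d, p) pairs visited along that single chain."""
    d, p = n, n
    chain = [(d, p)]
    while not (d == 0 and p == 0):
        if d > 0:          # case 3: recurse on d - 1
            d -= 1
        else:              # case 4: recurse on p - 1
            p -= 1
        chain.append((d, p))
    return chain

print(len(case_chain(3)))  # → 7, i.e. d + p + 1 = 2*3 + 1
```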

ShadSterling