3

I was recently working on an old qualifying exam problem asking us to generalize Karger's randomized global min cut algorithm to the global min $k$-cut problem. I recalled the strategy of running a recursive version of the randomized edge contraction algorithm to shrink the error probability, ultimately yielding an efficient algorithm that finds the global min $k$-cut with probability $1/2$.

Most of the analysis went well until I tried to analyze the success probability of the recursive algorithm. I found that this probability satisfies $$P_i \geq P_{i-1} - \frac{1}{4} P_{i-1}^2,$$ where $P_{i}$ is the probability of success of a call at level $i$ of the recursion tree, with a leaf having $i = 0$. One can show that the depth of the tree is $d = \frac{c \log_2(n)}{k-1}$ for some constant $c$, so ultimately I wanted to lower bound $P_{d}$. However, I struggled to bound this recurrence. I knew from studying the min cut algorithm that this recurrence should satisfy the lower bound $P_{d} = \Omega(1/\log(n))$. I wanted to show this, but the best I managed on my own was \begin{align} P_{i} &\geq P_{i-1} - \frac{1}{4} P_{i-1}^2 \\ &\geq P_{i-1}\left( 1 - \frac{1}{4}\right) \tag{ $P_{i-1} \leq 1$ } \\ &= \frac{3}{4} P_{i-1} \\ &\geq \left(\frac{3}{4}\right)^{i} P_{0}, \end{align} which, evaluated at $i = d$, gives \begin{align} P_{d} &\geq (3/4)^{d} P_0 \\ &= (4/3)^{-c \log_2(n)/(k-1)} P_0 \\ &= n^{-\frac{c}{(k-1) \log_{4/3}(2)}} P_0 \\ &\approx n^{-\frac{0.415 c}{k-1}} P_0. \end{align}
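To see concretely how lossy the $(3/4)^i$ step is, here is a small numerical sketch of my own (not from the exam problem): it iterates the worst-case recurrence $P_i = P_{i-1} - P_{i-1}^2/4$ from $P_0 = 1$ and compares it against the exponential bound. The true sequence decays only polynomially in $i$, while the bound collapses exponentially.

```python
# Iterate the worst-case recurrence P_i = P_{i-1} - P_{i-1}^2 / 4 and
# compare against the exponential lower bound (3/4)^i derived above.

def recurrence(depth, p0=1.0):
    """Value of P_depth under the equality version of the recurrence."""
    p = p0
    for _ in range(depth):
        p -= p * p / 4.0
    return p

for d in (10, 50, 200):
    # The first column decays like 1/d; the second decays exponentially.
    print(d, recurrence(d), (3.0 / 4.0) ** d)
```

At $d = 200$ the true value is still about $0.02$, while $(3/4)^{200}$ is astronomically small.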

This is a much worse lower bound than what can be shown. At that point I went back to the original paper on the recursive randomized min cut algorithm. The trick used there is a nonlinear change of variables in the recurrence: set $P_{k} = 4/Z_{k}$, where $Z_{k}$ is the new recurrence variable. This change leads to a new recurrence of the form $$Z_{k} \leq Z_{k-1} + 1 + \frac{1}{Z_{k-1} - 1}.$$ Since $Z_{k} = 4 / P_{k} \geq 4$, the last term is at most $1$, so we can upper bound the recurrence as $$Z_{k} \leq Z_{k-1} + 2 \leq 2 k + Z_{0} = 2 k + \frac{4}{P_{0}},$$ which finally implies that indeed $P_{d} = 4/Z_{d} = \Omega(1/\log(n))$. I see that this transformation works, but I do not have intuition for why one might think to try such a transformation. Can anyone help me with that intuition? Are there any other useful tips for tackling bounds on nonlinear recurrences?
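As a sanity check on the change-of-variables bound (my own verification, not from the paper), the chain above implies $P_d \geq 4/(2d + 4/P_0)$; iterating the recurrence at equality with $P_0 = 1$ confirms it:

```python
# Verify P_d >= 4 / (2d + 4/P_0) for the equality version of the
# recurrence P_k = P_{k-1} - P_{k-1}^2 / 4, with P_0 = 1.

def p_after(depth, p0=1.0):
    p = p0
    for _ in range(depth):
        p -= p * p / 4.0
    return p

for d in (5, 50, 500):
    lower = 4.0 / (2 * d + 4.0)  # the bound from Z_d <= 2d + Z_0
    assert p_after(d) >= lower
    print(d, p_after(d), lower)
```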

spektr
  • 453
  • 2
  • 10

1 Answer

4

First of all, let me explain why your estimate is lossy: when $P_i$ becomes small, $P_{i-1} - P_{i-1}^2/4$ is very close to $P_{i-1}$, and in particular, it doesn't decrease exponentially.

Since the function $x - x^2/4$ is monotone in the range $[0,1]$, if you solve the corresponding recurrence $Q_i = Q_{i-1} - Q_{i-1}^2/4$ (with base case $Q_0 = 1$), then $P_i \geq Q_i$.

Let us write the recurrence relation as $Q_i - Q_{i-1} = -Q_{i-1}^2/4$. This is a difference equation which can be approximated by a differential equation $q' = -q^2/4$, with $q(0) = 1$. Using standard techniques, you can determine the solution, which is $q(i) = 4/(4+i)$. This suggests that $Q_i \approx 4/(4+i)$, and so hints that $4/Q_i$ is a useful quantity to look at.
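One can check numerically how well the continuous solution tracks the discrete sequence (my own illustration, not part of the original argument):

```python
# The ODE q' = -q^2/4 with q(0) = 1 solves to q(i) = 4/(4+i).
# Compare the discrete recurrence Q_i against that solution.
q = 1.0
n_steps = 1000
for _ in range(n_steps):
    q -= q * q / 4.0  # Q_i = Q_{i-1} - Q_{i-1}^2 / 4
print(q, 4.0 / (4.0 + n_steps))  # the two agree to within about 1%
```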

Here is another way to estimate the recurrence. Write the recurrence as $$ Q_i = (1-Q_{i-1}/4) Q_{i-1}. $$ In a range of $i$ where $Q_i \approx \theta$, it takes $\Theta(1/\theta)$ steps to reduce $Q_i$ by a factor of $1/2$. In particular, after $$ \Theta(1+2+\cdots+2^{\ell-1}) = \Theta(2^\ell) $$ steps, we reach $Q_i \approx 2^{-\ell}$. This suggests that $Q_i$ scales like $\Theta(1/i)$. With a little bit of work, this argument can be made completely formal, though it is hard to get a hold on the exact constant factor in this way.
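The halving argument can also be watched in action; the following sketch (mine) counts the number of recurrence steps between successive halvings of $Q_i$ and checks that the counts roughly double each time, consistent with $Q_i = \Theta(1/i)$:

```python
# Count steps between successive halvings of Q_i under
# Q_i = (1 - Q_{i-1}/4) Q_{i-1}, starting from Q_0 = 1.
q, steps, target = 1.0, 0, 0.5
counts = []  # steps needed for each successive halving
while len(counts) < 8:
    q -= q * q / 4.0
    steps += 1
    if q <= target:
        counts.append(steps)
        steps, target = 0, target / 2.0
ratios = [counts[i + 1] / counts[i] for i in range(len(counts) - 1)]
print(counts, ratios)  # the ratios settle near 2, i.e. counts double
```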

Yuval Filmus
  • 280,205
  • 27
  • 317
  • 514