
Some algorithms, such as union-find, have running times whose asymptotic complexity involves the nearly-constant inverse Ackermann function, and they are worst-case time optimal if the nearly-constant inverse Ackermann term is ignored.

Are there any examples of known algorithms whose running times involve functions that grow fundamentally slower than the inverse Ackermann function (e.g., inverses of functions that are not equivalent to the Ackermann function under polynomial, exponential, or similar transformations), and that give the best-known worst-case time complexity for the underlying problem?
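For reference, here is the canonical inverse-Ackermann example mentioned above: a union-find (disjoint-set) structure with union by rank and path halving, sketched in Python (this is a standard textbook sketch, not code from the question). Any sequence of $m$ operations on $n$ elements runs in $O(m\alpha(m,n))$ time.

```python
class DisjointSet:
    """Union-find with union by rank and path halving.

    Any sequence of m find/union operations on n elements takes
    O(m * alpha(m, n)) time -- the inverse-Ackermann bound."""

    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        while self.parent[x] != x:
            # Path halving: point x at its grandparent as we walk up.
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return False          # already in the same set
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx       # attach the shallower tree to the deeper
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
        return True
```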

Gilles 'SO- stop being evil'
user2566092

3 Answers


Seth Pettie came up with an algorithm for computing the sensitivity of a minimum spanning tree in time $O(m\log \alpha(m,n))$, improving on an algorithm of Tarjan which computes the same in time $O(m\alpha(m,n))$. (Compare this to Chazelle's $O(m\alpha(m,n))$ algorithm for computing the minimum spanning tree itself.) The sensitivity problem asks to compute, for a given graph and a given minimum spanning tree, by how much each edge weight can change without changing the minimum spanning tree.

(Thanks to Tsvi Kopelowitz for this reference.)
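For contrast with these near-linear bounds, the sensitivity problem can be solved naively in $O(mn)$ time directly from its definition. The following Python sketch (my own illustration, not Tarjan's or Pettie's algorithm) computes, for each tree edge, how much its weight may increase, and for each non-tree edge, how much it may decrease, before the given tree stops being a minimum spanning tree.

```python
from collections import defaultdict

def mst_sensitivity(edges, tree_edges):
    """Naive O(m*n) MST sensitivity analysis (an illustrative sketch).

    edges: list of (u, v, w) tuples; tree_edges: the subset forming the MST.
    Returns a dict mapping each edge to the amount its weight may change
    (increase for tree edges, decrease for non-tree edges)."""
    tree = set(tree_edges)
    adj = defaultdict(list)
    for e in tree_edges:
        u, v, w = e
        adj[u].append((v, e))
        adj[v].append((u, e))
    nontree = [e for e in edges if e not in tree]
    slack = {}

    # A tree edge may increase until it matches the lightest non-tree
    # edge crossing the cut obtained by deleting it from the tree.
    for e in tree_edges:
        u, v, w = e
        seen, stack = {u}, [u]          # component of u in T - e
        while stack:
            x = stack.pop()
            for y, f in adj[x]:
                if f is not e and y not in seen:
                    seen.add(y)
                    stack.append(y)
        crossing = [wf for (a, b, wf) in nontree if (a in seen) != (b in seen)]
        slack[e] = min(crossing) - w if crossing else float('inf')

    # A non-tree edge may decrease until it matches the heaviest tree
    # edge on the tree path between its endpoints.
    for e in nontree:
        u, v, w = e

        def heaviest(x, target, prev, best):
            if x == target:
                return best
            for y, f in adj[x]:
                if f is not prev:
                    r = heaviest(y, target, f, max(best, f[2]))
                    if r is not None:
                        return r
            return None

        slack[e] = w - heaviest(u, v, None, 0)
    return slack
```

On a triangle with weights 1, 2, 5 and the two light edges in the tree, the tree edges can grow by 4 and 3 respectively, and the non-tree edge can shrink by 3, before the MST changes.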

Yuval Filmus

The most comically slow-growing function I've ever seen seriously used in a paper is $\alpha^*(n)$, the number of times you have to apply the inverse Ackermann function to drop $n$ to some fixed constant. It's used in this paper on the deque conjecture for splay trees.
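The construction behind $\alpha^*$ is generic: count how many times a slow-growing function must be iterated before its argument falls below a constant. A small Python sketch (my own illustration), using $\log_2$ as a stand-in for the inverse Ackermann function, so that it computes the familiar $\log^* n$; plugging in $\alpha$ instead would give the $\alpha^*(n)$ of the paper:

```python
import math

def times_to_reduce(f, n, threshold=2):
    """Number of applications of f needed to bring n down to <= threshold.

    With f = log2 this is the familiar log-star function; with f = the
    inverse Ackermann function it would be alpha-star."""
    count = 0
    while n > threshold:
        n = f(n)
        count += 1
    return count

def log_star(n):
    # log*(n): iterations of log2 needed to reach a value <= 2.
    return times_to_reduce(math.log2, n)
```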

Yuval Filmus
templatetypedef

In the early days of computability theory, several models of computation had been proposed. It turned out that some of these models (Turing machines, the λ-calculus, and the general recursive functions) can simulate each other AND compute the Ackermann function, whereas weaker models such as primitive recursion cannot compute the Ackermann function (it is the classic example of a computable function that is not primitive recursive). The stronger, mutually equivalent models became the standard foundation for computation. This does not mean the Ackermann function is any kind of limit for computers; indeed there are computable functions that grow faster than the Ackermann function, for example obtained by diagonalization.

Now, I don't think there's a practical problem that matches your question, but perhaps we can construct one. We need to put constraints on the input, though. Since we cannot afford O(n) time, we cannot read the whole input. In fact, we cannot even read the length of the input written in binary, as that would take O(log n) time. So the input needs to begin with a parameter describing the length of the rest, for example a value c such that Ackermann(c) is the length of the input. Since that only gets us back to inverse Ackermann, we instead demand as the first value of the input a parameter c such that bb(c) is about the length of the input, where bb is the busy beaver function. This function is uncomputable, but bb(c) certainly exists. Then, the algorithm goes like:

// Reads only the first c cells of the input, so the running time is
// O(c), i.e., roughly bb^{-1}(n) for an input of length n.
for (int i = 0; i < c; i++) {
    if (input[i] == null) {
        return false; // the input is shorter than promised
    }
}
return true;

The purpose of the algorithm is to check that the input is at least c cells long; since c is roughly the inverse of bb applied to the input length n, the running time is O(bb⁻¹(n)), which grows slower than the inverse of any computable function.

In the same way, the inverse of any computable function that grows faster than the Ackermann function could be used to construct an algorithm answering your question on any input.

Albert Hendriks