
How would one go about classifying the time complexity of Ackermann's function, and can we say that the complexity of every primitive-recursive function is asymptotically bounded above by the complexity of the Ackermann function?

Edit: I want to make it clear that I am interested in the time complexity of computing Ackermann's function, and in whether the time complexity of computing any primitive-recursive function is asymptotically bounded by the time complexity of computing Ackermann's function. I am not interested in the actual values of those functions.

I am asking because I remember seeing my second question asserted in an old book, but no proof was given and no time complexity for computing Ackermann's function was stated, so the assertion was by no means rigorous.

2 Answers


You are asking (or at least typing) multiple questions that are unrelated.

Time Complexity of the Ackermann function

First, understand that this is not a meaningful query. The Ackermann function, let's call it $A$ (assume a one-parameter variant), is a function, not an algorithm. You can certainly talk about $\Theta(A)$.

What you likely mean is the time $T_A(n)$ required to compute $A(n)$. I do not know of any algorithm besides explicitly computing the recurrence; the runtime of that direct computation can be analysed in the usual way. It's unlikely that the result will be a meaningful $\Theta$-asymptotic in terms of the "usual" basic functions; you would get runtime bounds expressed in terms of $A$ itself.
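For concreteness, here is a minimal sketch (in Python, assuming the standard two-argument Ackermann–Péter definition and taking the diagonal $A(n) = A(n, n)$ as the one-parameter variant) that computes the recurrence directly and counts recursive calls as a crude proxy for $T_A(n)$; the names `ackermann` and `steps` are just illustrative.

```python
import sys

sys.setrecursionlimit(100_000)  # the recursion gets deep very quickly

steps = 0  # crude proxy for T_A(n): number of recursive calls made


def ackermann(m, n):
    """Two-argument Ackermann-Peter function, computed directly from the recurrence."""
    global steps
    steps += 1
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))


def A(n):
    """One-parameter (diagonal) variant A(n) = A(n, n)."""
    return ackermann(n, n)


# Only tiny inputs are feasible: A(3) = 61 already takes a few thousand calls,
# and A(4) is far beyond anything computable this way.
print(A(3), steps)
```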

That does not mean that there is no faster algorithm; a faster algorithm would not contradict the growth of $A(n)$ per se, since, depending on your computational and cost model, you can compute huge numbers at little cost.
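To illustrate (a hedged sketch, not a general algorithm): the Ackermann–Péter function has well-known closed forms for small first arguments, and with arbitrary-precision integers these produce enormous values far more cheaply than unwinding the recurrence; `ackermann_shortcut` is just an illustrative name.

```python
def ackermann_shortcut(m, n):
    """Ackermann-Peter value via closed forms for m <= 3, recurrence otherwise.

    Standard identities:
      A(0, n) = n + 1
      A(1, n) = n + 2
      A(2, n) = 2n + 3
      A(3, n) = 2^(n + 3) - 3
    """
    if m == 0:
        return n + 1
    if m == 1:
        return n + 2
    if m == 2:
        return 2 * n + 3
    if m == 3:
        return 2 ** (n + 3) - 3  # huge, yet cheap with arbitrary-precision ints
    if n == 0:
        return ackermann_shortcut(m - 1, 1)
    return ackermann_shortcut(m - 1, ackermann_shortcut(m, n - 1))


# A(3, 1000) has roughly 300 decimal digits but is computed instantly here,
# while the plain recursion would need astronomically many calls.
print(len(str(ackermann_shortcut(3, 1000))))
```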

Are all primitive-recursive functions in $O(A)$?

Yes. That is (a consequence of) how we show that $A$ is not primitive recursive.

Note that here we are back to comparing the values of the functions; no claim about time complexity is being made.
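For reference, the usual domination lemma behind this statement is, roughly (the exact formulation depends on which variant of $A$ one uses):

$$\text{for every primitive-recursive } f \text{ there is a } c \in \mathbb{N} \text{ such that } f(n_1, \dots, n_k) < A(c, \max_i n_i) \text{ for all inputs.}$$

Since $A$ is monotone in both arguments, $f(n) < A(c, n) \leq A(n, n) = A(n)$ for all $n \geq c$, which gives $f \in O(A)$.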

Can all primitive-recursive functions be computed in time $O(T_A(n))$?

Not knowing $\Theta(T_A)$, I do not have an answer to this question.

Raphael

I'm on my phone, so forgive the brevity, but I think you're looking for the concepts of tetration and Knuth's up-arrow notation.

With those concepts in hand, I think the running time of the Ackermann function $(A~a~b)$, as defined in Structure and Interpretation of Computer Programs, is $O(n(\uparrow))$, where $n(\uparrow)$ means $n$ arrows and $n = a - 2$: $(A~1~x)$ takes $x$ steps, $(A~2~x)$ takes $2x$ steps, $(A~3~x)$ takes $2 \uparrow x = 2^x$ steps, $(A~4~x)$ takes $2 \uparrow\uparrow x$ steps, and so on.

I'm not certain this is exactly right, but this is the neighborhood. I'm still wrapping my head around arrow notation in this context.
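If memory serves, SICP (exercise 1.10) defines the function as below; here is a Python transcription with a call counter (the name `calls` is just illustrative), so the step counts claimed above can be checked for small inputs.

```python
calls = 0  # number of recursive calls, as a rough "step" count


def A(a, b):
    """Python transcription of the Ackermann variant from SICP (exercise 1.10)."""
    global calls
    calls += 1
    if b == 0:
        return 0
    if a == 0:
        return 2 * b
    if b == 1:
        return 2
    return A(a - 1, A(a, b - 1))


# For example, (A 2 4) = 2^(2^(2^2)) = 65536; compare `calls` across inputs
# to get a feel for the growth.
print(A(2, 4), calls)
```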

So, regarding whether the Ackermann function (as implemented in SICP) can be used as an upper bound on some other algorithm: I think this reduces to the question of whether primitive-recursive algorithms are bounded above by $O(n(\uparrow))$.

Tristan