46

I've been seeing all over Stack Overflow, e.g. here, here, here, here, here, and some others I don't care to mention, that "any program that uses recursion can be converted to a program using only iteration".

There was even a highly upvoted thread with a highly upvoted answer that said yes, it's possible.

Now I'm not saying they're wrong. It's just that that answer runs counter to my meagre knowledge and understanding of computing.

I believe every iterative function can be expressed as recursion, and Wikipedia has a statement to that effect. However, I doubt the converse is true. For one, I doubt that non-primitive-recursive functions can be expressed iteratively.

I also doubt hyper-operations can be expressed iteratively.

In his answer (which I don't understand, by the way) to my question, @YuvalFilmus said that it's not possible to convert any sequence of mathematical operations into a sequence of additions.

If YF's answer is indeed correct (I guess it is, but his reasoning was above my head), then doesn't this mean that not every recursion can be converted into iteration? Because if it were possible to convert every recursion into iteration, I'd be able to express all operations as a sequence of additions.

My question is this:

Can every recursion be converted to iteration and why?

Please give an answer a bright high schooler or a first-year undergrad will understand. Thank you.

P.S. I don't know what primitive recursive means (I do know about the Ackermann function, and that it isn't primitive recursive but is still computable. All my knowledge of it comes from the Wikipedia page on the Ackermann function.)

P.P.S.: If the answer is yes, could you, for example, write an iterative version of a non-primitive-recursive function (e.g. Ackermann) in the answer? It'll help me understand.

Tobi Alafin
  • 1,647
  • 4
  • 17
  • 22

5 Answers

55

It's possible to replace recursion by iteration plus unbounded memory.

If you only have iteration (say, while loops) and a finite amount of memory, then all you have is a finite automaton. With a finite amount of memory, the computation has a finite number of possible steps, so it's possible to simulate them all with a finite automaton.

Having unbounded memory changes the deal. This unbounded memory can take many forms which turn out to have equivalent expressive power. For example, a Turing machine keeps it simple: there's a single tape, and the computer can only move forward or backward on the tape by one step at a time — but that's enough to do anything that you can do with recursive functions.

A Turing machine can be seen as an idealized model of a computer (finite state machine) with some extra storage that grows on demand. Note that it's crucial not only that there is no finite bound on the tape, but also that, even given the input, you can't reliably predict how much tape will be needed. If you could predict (i.e. compute) how much tape is needed from the input, then you could decide whether the computation would halt by calculating the maximum tape size and then treating the whole system, including the now finite tape, as a finite state machine.

Another way to simulate a Turing machine with computers is as follows. Simulate the Turing machine with a computer program that stores the beginning of the tape in memory. If the computation reaches the end of the part of the tape that fits in memory, replace the computer by a bigger computer and run the computation again.

Now suppose that you want to simulate a recursive computation with a computer. The techniques for executing recursive functions are well-known: each function call has a piece of memory, called a stack frame. Crucially, recursive functions can propagate information through multiple calls by passing variables around. In terms of implementation on a computer, that means that a function call might access the stack frame of a (grand-)*parent call.

A computer is a processor — a finite state machine (with a huge number of states, but we're doing computation theory here, so all that matters is that it's finite) — coupled with a finite memory. The microprocessor runs one giant while loop: “while the power is on, read an instruction from memory and execute it”. (Real processors are much more complex than that, but it doesn't affect what they can compute, only how fast and conveniently they do it.) A computer can execute recursive functions with just this while loop to provide iteration, plus the mechanism to access memory, including the ability to increase the size of the memory at will.
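To illustrate that giant while loop, here is a toy sketch in Python (not a real instruction set; the opcodes and the `run` function are invented for this example). The only control structure is one while loop, and the unbounded memory is a list used as a stack:

```python
def run(program):
    """Fetch-execute loop: read an instruction, execute it, repeat."""
    pc = 0          # program counter
    stack = []      # unbounded memory, grown on demand
    while pc < len(program):           # "while the power is on"
        op, arg = program[pc]
        if op == "push":
            stack.append(arg)          # put a constant in memory
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)        # combine the top two values
        elif op == "jump":
            pc = arg                   # loops are just jumps backward
            continue
        pc += 1
    return stack

# (1 + 2) + 3, computed with iteration and memory alone
print(run([("push", 1), ("push", 2), ("add", None),
           ("push", 3), ("add", None)]))
```

Adding a few more opcodes (conditional jumps, loads and stores) makes this toy machine Turing-complete, yet its control flow never changes: it is still one while loop.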

If you restrict the recursion to primitive recursion, then you can restrict iteration to bounded iteration. That is, instead of using while loops with an unpredictable running time, you can use for loops where the number of iterations is known at the beginning of the loop¹. The number of iterations might not be known at the beginning of the program: it can itself have been computed by previous loops.
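To make that concrete, here is a small sketch in Python (the function names are mine, not from any standard formulation). Every loop bound is fixed when the loop starts, but the inner bound is itself computed by earlier iterations:

```python
def power(base, exp):
    """base**exp using only a bounded loop."""
    result = 1
    for _ in range(exp):        # bound known when the loop starts
        result *= base
    return result

def tower(base, height):
    """Iterated exponentiation: the bound of each inner loop is
    the result computed by the previous ones."""
    result = 1
    for _ in range(height):
        result = power(base, result)
    return result

print(power(2, 10))   # 1024
print(tower(2, 3))    # 2**(2**2) = 16
```

Both functions are primitive recursive, and neither needs a while loop; the Ackermann function, by contrast, cannot be written this way no matter how the bounded loops are nested.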

I'm not going to even sketch a proof here, but there is an intuitive relationship between going from primitive recursion to full recursion, and going from for loops to while loops: in both cases, it involves not knowing in advance when you'll stop. With full recursion, this is done with the minimization operator, where you keep going until you find a parameter that satisfies the condition. With while loops, this is done by keeping going until the loop condition is satisfied.

¹ for loops in C-like languages can perform unbounded iteration just like while, it's just a matter of convention to restrict them to bounded iteration. When people talk about “for loops” in theory of computation, that means only loops that count from 1 to $n$ (or equivalent).

Gilles 'SO- stop being evil'
  • 44,159
  • 8
  • 120
  • 184
35

Every recursion can be converted to iteration, as witnessed by your CPU, which executes arbitrary programs using a fetch-execute infinite iteration. This is a form of the Böhm-Jacopini theorem. Moreover, many Turing-complete models of computation have no recursion, for example Turing machines and counter machines.

Primitive recursive functions correspond to programs using bounded iteration, that is, you have to specify the number of iterations that a loop is executed in advance. Bounded iteration cannot simulate recursion in general, since the Ackermann function isn't primitive recursive. But unbounded iteration can simulate any partially computable function.

Yuval Filmus
  • 280,205
  • 27
  • 317
  • 514
26

As an example for the answer from Gilles, here is an "iterative" algorithm for the Ackermann function (using the common Ackermann–Péter version $a(n,m)$ mentioned by Wikipedia).

We need a stack $s$ of integers.

Such a stack has two modifying operations, $\DeclareMathOperator{\push}{push}\push(s, x)$ (which puts a new element $x$ on the stack) and $\DeclareMathOperator{\pop}{pop} x ← \pop(s)$ which retrieves (and removes) the top element, and one querying operation $\DeclareMathOperator{\empty}{empty}\empty(s)$, which returns true if there are no elements left on the stack (false if there are more).

$\texttt{Ackermann}(n_0, m_0):$ $\def\ifop#1{\texttt{if(}#1\texttt{):}}$ $\def\elseif#1{\texttt{else if(}#1\texttt{):}}$ $\def\elsop{\texttt{else:}}$

  • $s = \emptyset $ (initialize empty stack)
  • $\push(s,n_0)$
  • $\push(s,m_0)$
  • $\texttt{while(true):}$
    • $m ← \pop(s)$
    • $\ifop{\empty(s)}$
      • $\texttt{return }m$ (this ends the loop)
    • $n ← \pop(s)$
    • $\ifop{n = 0}$
      • $\push(s, m+1)$
    • $\elseif{m=0}$
      • $\push(s, n-1)$
      • $\push(s, 1)$
    • $\elsop$
      • $\push(s, n-1)$
      • $\push(s, n)$
      • $\push(s, m-1)$

I implemented this in Ceylon (the Ceylon Web IDE unfortunately doesn't work anymore); it outputs the stack at the start of each iteration of the loop.

Of course, this just moves the implicit call stack of the recursion into an explicit stack holding the parameters and return values.
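For readers who prefer a runnable version, the pseudocode above translates almost line for line into Python (using a plain list as the stack):

```python
def ackermann(n0, m0):
    """Ackermann-Peter function a(n, m), computed with a while loop
    and an explicit stack instead of recursion."""
    stack = [n0, m0]
    while True:
        m = stack.pop()
        if not stack:               # nothing left below m: m is the answer
            return m
        n = stack.pop()
        if n == 0:
            stack.append(m + 1)     # a(0, m) = m + 1
        elif m == 0:
            stack.append(n - 1)     # a(n, 0) = a(n - 1, 1)
            stack.append(1)
        else:
            stack.append(n - 1)     # a(n, m) = a(n - 1, a(n, m - 1)):
            stack.append(n)         # compute the inner call first;
            stack.append(m - 1)     # its result becomes m for (n - 1)

print(ackermann(2, 3))  # 9
print(ackermann(3, 3))  # 61
```

Note that the loop has no fixed bound: the stack can grow enormously (try even $a(4, 2)$), which is exactly the unbounded memory the accepted answer says iteration needs.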

Paŭlo Ebermann
  • 533
  • 1
  • 4
  • 12
16

There are already some great answers (which I can't even hope to compete with), but I'd like to pitch this simple explanation.

Recursion is just the manipulation of the runtime stack. Recursing adds a new stack frame (for the new invocation of the recursive function), and returning removes a stack frame (for the just-completed invocation of the recursive function). Recursion will cause some number of stack frames to be added/removed, until eventually they all exit (hopefully!) and the result is returned to the caller.

Now, what would happen if you made your own stack, replaced recursive function calls with pushing to the stack, replaced returning from recursive functions with popping the stack, and had a while loop that ran until the stack was empty?
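As a sketch of the answer to that question (in Python, using factorial as a deliberately simple example), you get an iterative program that computes exactly the same thing:

```python
def fact_rec(n):
    """The recursive version: each call adds a runtime stack frame."""
    return 1 if n == 0 else n * fact_rec(n - 1)

def fact_iter(n):
    """The same computation with a hand-rolled stack and a while loop."""
    stack = []
    while n > 0:            # "recursing": push one frame per call
        stack.append(n)
        n -= 1
    result = 1
    while stack:            # "returning": pop frames, combining results
        result *= stack.pop()
    return result

print(fact_rec(10), fact_iter(10))  # 3628800 3628800
```

For functions that recurse more than once per call (like Ackermann above), the frames pushed onto the explicit stack need to record more state, but the recipe is the same.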

Alexander
  • 528
  • 2
  • 8
2

As far as I can tell, and in my own experience, you can implement any recursion as an iteration. As mentioned above, recursion uses the stack, which is conceptually unlimited but practically limited (have you ever gotten a stack overflow message?). In my early days of programming (in the third quarter of the last century of the last millennium), I used non-recursive languages to implement recursive algorithms and had no problems. I am not sure how one would prove it, though.