
A recursive function on $\mathbb{N}$ can be defined as follows:

Given an element $a \in \mathbb{N}$ and a function $f:\mathbb{N}\rightarrow\mathbb{N}$, we can define a function $g:\mathbb{N}\rightarrow\mathbb{N}$ as follows:

  1. $g(0) = a$
  2. $g(n+1) = f(g(n))$
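For instance, taking $a=1$ and $f(m)=b\cdot m$ yields $g(n)=b^n$. A minimal Python sketch of this iteration (illustrative only; the name `recurse` is my own):

```python
def recurse(a, f, n):
    """Compute g(n), where g(0) = a and g(k+1) = f(g(k))."""
    g = a
    for _ in range(n):
        g = f(g)
    return g

# Exponentiation b**n arises from a = 1 and f(m) = b * m:
print(recurse(1, lambda m: 3 * m, 4))  # 3**4 = 81
```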

The fact that $g$ exists and is unique can be proved using the Recursion Theorem in set theory.

My understanding is that in Peano Arithmetic, it is possible to prove the existence of exponentiation using just addition and multiplication, but is this true for any recursively defined function?

In other words, are addition and multiplication strong enough to prove the existence of any recursive function? That is, can the Recursion Theorem be proved using the Peano Axioms (including addition and multiplication)?

Update: Since the answers below indicate that this is true, I am looking for an outline of a proof or a reference that shows that addition and multiplication are powerful enough to define any recursive function.

2 Answers


$PA$ can indeed prove the recursion theorem, although it's a bit of work to formalize it properly since $PA$ doesn't talk directly about functions. It's much better to work in the theory $ACA_0$, a conservative two-sorted extension of $PA$ which does talk directly about functions.


However, this isn't really the end of the story. It's not that $PA$ proves the existence of exponentiation, but rather that $PA$ proves the totality of exponentiation. And, in fact, $PA$ does not prove the totality of every recursive function! This is a somewhat different issue, and I suspect that this is what you are really interested in.

Noah Schweber
  • Thanks, this is interesting. Can you elaborate or provide references that I can read on how one might go about proving the recursion theorem in PA, and what you mean by existence and totality? – Kostub Deshmukh Jan 21 '16 at 19:40
  • @KostubDeshmukh So as I said, even formulating the recursion theorem in PA is some work. I would first look at $ACA_0$; Simpson's book "Subsystems of Second-order arithmetic" describes it in detail, and it's not hard to see how the standard proof of the recursion theorem goes through unchanged in $ACA_0$. For $PA$, I don't know of a canonical source, but it's essentially the same: once you state it properly, it follows immediately from induction. (But you need to use $+$ and $\times$ in the process of stating it properly.) – Noah Schweber Jan 21 '16 at 19:41

Over first-order Peano Arithmetic, the Recursion Theorem can be proven as a metatheorem which quantifies over arithmetical formulae. Stated formally, it is the following assertion.

Metatheorem: Let $\phi(x,y)$ be any formula in the language of arithmetic. Then there is another formula $\psi(x,y)$ such that the following is a theorem of Peano Arithmetic.

$$\begin{align} (\forall x, \exists!y, \phi(x,y)) \implies \forall a,\big[&(\forall x, \exists!y,\psi(x,y)) \\ &\land \psi(0,a) \\ &\land \forall(x,y,z), (\psi(x,y)\land\phi(y,z))\implies\psi(x+1,z)\big] \end{align}$$

This is quite complicated, but essentially it just says: if $\phi$ defines a function, then $\psi$ also defines a function, and moreover those functions obey the requisite recursive relation. If we think of $\phi$ as encoding the graph of a function $F$, and $\psi$ encoding the graph of a function $G$, then the above metatheorem can be loosely summarized as follows.

Metatheorem: Let $F:\mathbb{N}\to\mathbb{N}$ be any arithmetically definable function. Then there is another arithmetically definable function $G:\mathbb{N}\to\mathbb{N}$ such that the following is proven by Peano Arithmetic. $$G(0)=a \land \forall x, G(x+1) = F(G(x))$$

As a remark, the formulae $\phi$ and $\psi$ are allowed to have extraneous free variables, which are considered to be constant during the recursion. The variable $a$ is also considered to be an implicit free variable of $\psi$.


Part 1: Interpreting Finite Sequences

The proof of the metatheorem is broken up into two parts. The first part is extremely complicated: we need some method by which Peano Arithmetic can reason about finite number sequences. This is possible with the help of Gödel's beta function. The related proofs are highly technical, so I won't prove them here, but you can check this answer which offers some insight into how and why the beta function works. In summary, the following theorem is proven about Gödel's beta function.

Theorem: $\forall(S_1,S_2), \forall x_0,\forall y_0, \exists(Z_1,Z_2), \beta(Z_1,Z_2, x_0)=y_0 \land \forall(x<x_0), \beta(Z_1,Z_2,x)=\beta(S_1,S_2,x)$

To simplify, we can think of the pair $(S_1,S_2)$ as encoding a sequence $S$, where we define the indexing operation as $S[x]:=\beta(S_1,S_2,x)$. The above theorem thus asserts that, for any sequence $S$ and pair $x_0,y_0$, there exists a modified sequence $Z$ obeying $Z[x_0]=y_0$ and $Z[x]=S[x]$ for all $x<x_0$. As a consequence, it's possible to prove the following metatheorem, which codifies the fact that every possible finite sequence is encodable by some object $S$ using our indexing notation.

Metatheorem: For any arithmetically definable function $F:\mathbb{N}\to\mathbb{N}$, the following holds. $$\forall N, \exists S, \forall(x<N), S[x]=F(x)$$
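This metatheorem can be demonstrated numerically (a Python sketch; the names `beta` and `encode` are my own, and the CRT-based construction of the witness pair is one standard way to do it, not the only one):

```python
from math import factorial

def beta(a, b, i):
    # Gödel's beta function: beta(a, b, i) = rem(a, (i+1)*b + 1).
    return a % ((i + 1) * b + 1)

def encode(seq):
    # Produce a pair (a, b) with beta(a, b, i) == seq[i] for every index i.
    # Taking b = max(n, entries)! makes the moduli (i+1)*b + 1 pairwise
    # coprime and larger than every entry, so the Chinese Remainder
    # Theorem yields a suitable a.
    n = len(seq)
    b = factorial(max([n] + list(seq)))
    mods = [(i + 1) * b + 1 for i in range(n)]
    M = 1
    for m in mods:
        M *= m
    a = 0
    for r, m in zip(seq, mods):
        Mi = M // m
        a += r * Mi * pow(Mi, -1, m)  # pow(Mi, -1, m): modular inverse (Python 3.8+)
    return a % M, b

a, b = encode([3, 1, 4, 1, 5])
print([beta(a, b, i) for i in range(5)])  # [3, 1, 4, 1, 5]
```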


Part 2: Performing the Recursion

This part is actually much easier. To define the value $G(x)$, we just need a sequence $S$ such that $S[0]=a$ and generally $S[n+1]=F(S[n])$ for any $n<x$. In essence, $S$ encodes a finite initial segment of what $G$ is supposed to look like. We can then define $G(x)=S[x]$, and rest assured that $G(x+1)=F(G(x))$. The formula $\psi$ defining the graph of $G$ is produced as follows. $$\psi(x,y) :\equiv \exists S, S[x]=y\land S[0]=a\land\forall(n<x), S[n+1]=F(S[n])$$
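Read operationally (a Python sketch of the intended semantics, not of the first-order formula itself; the names are my own), the witness sequence $S$ is just the table of values of $G$ up to $x$:

```python
def G(x, a, F):
    # Build the witness sequence S with S[0] = a and S[n+1] = F(S[n]) for n < x,
    # then read off G(x) = S[x], exactly as the formula psi prescribes.
    S = [a]
    for n in range(x):
        S.append(F(S[n]))
    return S[x]

print(G(5, 1, lambda m: 2 * m))  # 2**5 = 32
```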

To prove that $\psi$ defines a function, we just show that for each $x$, the corresponding value of $y$ exists and is unique.

The existence of $y$ is proven by induction. Existence when $x=0$ is trivial, since we can find a sequence $S$ obeying $S[0]=a$ which witnesses the fact that $\psi(0,a)$. For the inductive step, suppose we have $y$ obeying $\psi(x,y)$, then find the witness sequence $S$ obeying $y=S[x]$. Next, construct an extended sequence $Z$ obeying $Z[x+1]=F(S[x])$ and $Z[n]=S[n]$ for all $n\leq x$. The extended sequence $Z$ obeys all our requirements, and thus witnesses the fact $\psi(x+1,Z[x+1])$, equivalently $\psi(x+1,F(y))$. This concludes the proof by induction, that $\forall x, \exists y, \psi(x,y)$.

To prove that $y$ is unique to $x$, we have a similar induction. The base case is trivial: if $\psi(0,y)$ then $y=S[0]=a$, hence $y=a$. For the inductive step, assume the $y$ satisfying $\psi(x,y)$ is unique, then let $z$ satisfy $\psi(x+1,z)$ with witness sequence $S$. It almost immediately follows that $\psi(n,S[n])$ for any $n\leq x+1$, so in particular $\psi(x,S[x])$. The uniqueness of $y$ implies that $y=S[x]$, and consequently $z=S[x+1]=F(S[x])=F(y)$. Since $z=F(y)$ and $y$ is unique, $z$ is also unique. This concludes the proof by induction of uniqueness, $\forall x, \exists!y, \psi(x,y)$.

To close off the proof, let $G$ be the function encoded by $\psi$. We proved $\psi(0,a)$, hence $G(0)=a$. We also proved the implication $\psi(x,y)\implies \psi(x+1,F(y))$, therefore $G(x+1)=F(G(x))$ for all $x$. With the desired recursive relation satisfied, this concludes the proof of the Recursion theorem.


Part 3: An Explicit Formula

This is a bonus, and can be skipped if you want. The above is all highly abstracted, but it's important to know that $\psi$ is expressible entirely in the language of arithmetic. After all, that's the entire point of doing this in Peano Arithmetic. Below, I slowly tease out an explicit definition for $\psi$, which is done by unraveling all the relevant definitions. First, we rewrite the function symbol $F$ using its defining formula $\phi$, hence $\psi(x,y)$ takes the following form. $$\exists S, S[x]=y\land S[0]=a\land\forall(n<x), \phi(S[n],S[n+1])$$

Next, we rewrite my indexing notation using Gödel's beta function.

$$\exists(S_1,S_2), \beta(S_1,S_2,x)=y\land \beta(S_1,S_2,0)=a\land\forall(n<x), \phi(\beta(S_1,S_2,n),\beta(S_1,S_2,n+1))$$

If you can see where this is going, feel free to stop reading, as it's about to become much messier. The next step is to rewrite the Beta function using its definition in terms of the remainder function. Namely, we have the definition $\beta(x_1,x_2,x_3) = \operatorname{rem}(x_1,x_3 x_2+x_2+1)$. Unraveling this definition makes $\psi(x,y)$ take the following form.

$$\begin{align} \exists(S_1,S_2), &\operatorname{rem}(S_1,x S_2+S_2+1)=y \land \operatorname{rem}(S_1, S_2+1)=a \\ &\land \forall(n<x), \phi(\operatorname{rem}(S_1,nS_2+S_2+1),\operatorname{rem}(S_1,(n+1)S_2+S_2+1)) \end{align}$$

To unravel the remainder function, we first want to extract it from the formula $\phi$, since we don't want to fiddle with the structure of $\phi$. $$\begin{align} \exists(S_1,S_2), &\operatorname{rem}(S_1,x S_2+S_2+1)=y \land \operatorname{rem}(S_1, S_2+1)=a \\ \land &\forall(n<x), \exists(r_3,r_4), \phi(r_3,r_4) \\ &\land r_3=\operatorname{rem}(S_1,nS_2+S_2+1) \\ &\land r_4=\operatorname{rem}(S_1,(n+1)S_2+S_2+1) \end{align}$$

The last step is to actually unravel the definition of the remainder function. To do this, we use the fact that $r=\operatorname{rem}(x,y)$ precisely when $r<y \land \exists q, qy+r=x$. We have four applications of the remainder function, so the resulting formula becomes quite bloated.

$$\begin{align} \exists(S_1,S_2), &y<(xS_2+S_2+1)\land (\exists q_1, q_1\cdot(xS_2+S_2+1)+y = S_1) \\ \land &a<(S_2+1)\land (\exists q_2, q_2\cdot(S_2+1)+a=S_1) \\ \land &\forall(n<x), \exists(r_3,r_4), \phi(r_3, r_4) \\ &\land (r_3<nS_2+S_2+1)\land \exists q_3, q_3\cdot(nS_2+S_2+1)+r_3 = S_1 \\ &\land (r_4<(n+1)S_2+S_2+1)\land\exists q_4, q_4\cdot((n+1)S_2+S_2+1)+r_4=S_1 \end{align}$$
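Incidentally, the characterization $r=\operatorname{rem}(x,y) \iff r<y \land \exists q,\, qy+r=x$ used in this step can be sanity-checked numerically (a Python sketch; `is_rem` is my own name):

```python
def is_rem(r, x, y):
    # r = rem(x, y) iff r < y and some natural q satisfies q*y + r == x,
    # i.e. x - r is a nonnegative multiple of y.
    return r < y and x >= r and (x - r) % y == 0

# Agrees with Python's % operator on natural numbers:
assert all(is_rem(x % y, x, y) for x in range(60) for y in range(1, 12))
assert not is_rem(4, 10, 7)  # rem(10, 7) = 3, not 4
```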

And that's it. We can see by inspection that this is entirely expressed in the language of arithmetic... Or at least, it would be if $<$ were in the language of arithmetic. To rewrite that, we'd need to use the definition $x<y$ given by $\exists k, x+k+1=y$. We'd also need to rewrite the bounded quantifiers, since the expression $\forall(x<y), \xi$ is just an abbreviation for $\forall x, x<y \implies \xi$. I don't want to do all that, so I'll call it quits here.... Just kidding, of course I'll do it.

$$\begin{align} \exists S_1,\exists S_2, &(\exists k_1, y+k_1+1=xS_2+S_2+1) \\ \land &\exists q_1, q_1\cdot(xS_2+S_2+1)+y = S_1 \\ \land &\exists k_2, a+k_2+1=S_2+1 \\ \land &\exists q_2, q_2\cdot(S_2+1)+a=S_1 \\ \land &\forall n, (\exists k_0, n+k_0+1=x) \implies \big[\exists r_3, \exists r_4, \phi(r_3, r_4) \\ &\land \exists k_3, r_3+k_3+1=nS_2+S_2+1 \\ &\land \exists q_3, q_3\cdot(nS_2+S_2+1)+r_3 = S_1 \\ &\land \exists k_4, r_4+k_4+1=(n+1)S_2+S_2+1 \\ &\land \exists q_4, q_4\cdot((n+1)S_2+S_2+1)+r_4=S_1\big] \end{align}$$

Ok, now we are done. The above is an explicit arithmetical formula defining $\psi(x,y)$. The fact that simple addition and multiplication can express such complex concepts is truly a marvel. The complexity of the resulting formula, however, is exactly why no one actually cares about the explicit formula. It's possible in principle, and that's enough to satisfy any theorist.

Jade Vanadium