I have been thinking recently about inherently sequential functions. In trying to wrap my head around them, I tried to come up with the simplest possible function that looks inherently sequential to me. For didactic purposes, I'd like some feedback.
Consider two sequences of 64-bit integers: $a_1, \ldots, a_N$ and $b_1, \ldots, b_N$. Now, consider the sequence of 64-bit integers $x_0, \ldots, x_N$ defined by \begin{align} x_0 &= 0 \\ x_{n + 1} &= (x_n \oplus a_{n + 1}) + b_{n + 1} \end{align} where I use $\oplus$ to denote bitwise XOR and $+$ to denote addition modulo $2^{64}$. Intuitively, $x_N$ is obtained by a long sequence of alternating XORs and additions. I am wondering: is the computation of $x_N$ inherently sequential?
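For concreteness, here is a short Python sketch of the straightforward sequential evaluation of the recurrence (the function name is mine):

```python
MASK64 = (1 << 64) - 1  # reduce results modulo 2^64

def x_final(a, b):
    """Compute x_N via x_{n+1} = (x_n XOR a_{n+1}) + b_{n+1} (mod 2^64)."""
    assert len(a) == len(b)
    x = 0  # x_0 = 0
    for ai, bi in zip(a, b):
        x = ((x ^ ai) + bi) & MASK64  # one XOR, one modular addition per step
    return x
```

Each step depends on the full value of the previous step, which is exactly why the computation looks like it cannot be parallelised.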
I cannot see how one could compute $x_N$ faster than by going through each operation in order, but I realise that I might be missing something.
Here I am interested in both theory and practice: I'm curious whether this function can be *proved* to be inherently sequential, but also whether it can be *safely assumed* to be inherently sequential, much as cryptographic hash functions are safely assumed to approximate a random oracle well.
More specifically
What I am wondering is whether all bits of $x_N$ can be computed by a polynomial-size Boolean circuit whose depth is smaller than that of the trivial circuit, which sequentially applies the XORs and additions.