4

Given $n$ and $k$ I would like to show that the number of solutions of the equation $$x_1+x_2+\dots + x_k = r \pmod{n}$$ where the $x_i$'s are integers satisfying $0<x_i<n$ is equal for all values of $r\in \{1,2,\dots, n-1\}$. For context, this was part of an online assessment test from a company for a software engineering position. During the test, I managed to calculate the number of solutions for $r=0$ by using a roots of unity filter and guesstimated that for all other $r$ it must be equal. Then since the sum over $r$ of the number of solutions is equal to $(n-1)^k$ I managed to get the answer by solving a simple equation. However, I would like to prove that my guesstimation is indeed correct. Using a roots of unity filter gives a messy calculation when $r\neq n$ which I could not easily simplify. Then I tried to find a bijection between the set of solutions for $r$ and $r'$ but again this approach was not fruitful. Any help is appreciated!

EDIT: Here is my complete thought process. Let $$P(x) = x + x^2 + \dots + x^{n-1}$$ Let $G(x)=P(x)^k = a_0 + a_1x + a_2x^2 + \dots$ where the coefficients of $x^m$ where $m>\deg(G)$ are zero. Then the answer for any $r>0$ is equal to $a_0 + a_r + a_{2r} + \dots$ (here $a_0 = 0$). For $r=0$ the answer is equal to $a_0+a_n+a_{2n}+\dots$. These can be calculated exactly by root of unity filters. That is, for $r>0$ we have $$a_0+a_r+a_{2r}+\dots = \frac{1}{r}\left(G(1)+G(\omega)+\dots + G(\omega^{r-1})\right)$$ where $\omega$ is an $r$-th root of unity. For $r=0$ we take $r=n$ in the above formula which simplifies nicely to $$\frac{1}{n}\left((n-1)^k - (-1)^k\right)$$ Now if $r\neq n$ there is no obvious simplification.
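For concreteness, the per-residue coefficient sums can be computed directly; here is a minimal Python sketch (the helper name `residue_sums` is my own) that expands $G(x) = P(x)^k$ by repeated convolution and adds up the coefficients in each residue class mod $n$:

```python
def residue_sums(n, k):
    """Sum the coefficients of G(x) = (x + x^2 + ... + x^{n-1})^k per residue class mod n."""
    coeffs = [1]                      # start with the constant polynomial 1
    P = [0] + [1] * (n - 1)           # coefficients of P(x) = x + x^2 + ... + x^{n-1}
    for _ in range(k):                # multiply by P(x), k times
        new = [0] * (len(coeffs) + n - 1)
        for i, c in enumerate(coeffs):
            for j, p in enumerate(P):
                new[i + j] += c * p
        coeffs = new
    sums = [0] * n
    for i, c in enumerate(coeffs):    # bucket a_i by i mod n
        sums[i % n] += c
    return sums

print(residue_sums(4, 3))  # → [6, 7, 7, 7]
```

For $n=4, k=3$ this gives equal sums for every $r \neq 0$, and a different value for $r = 0$.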

  • How large is $n$? – Erick Wong Oct 25 '24 at 02:39
  • I would think that inclusion-exclusion can be used to easily reduce the "excluding zeros" question to different sizes of the "zeros included" problem. – Erick Wong Oct 25 '24 at 02:41
  • @Erick Wong Well, you're right, I think, but I'm looking for a more direct approach (ideally I would like to construct a bijection). – George Giapitzakis Oct 25 '24 at 02:42
  • Note: Your approach is doomed because it is not true. EG For $n = 4, k = 2$, when $ r= 0$ we have 3 solutions $(1, 3), (2, 2), (3, 1)$. But when $ r=1$ we only have 2 solutions $(2, 3), (3, 2)$. – Calvin Lin Oct 25 '24 at 07:55
  • 1
    Note that the statement "Then the answer for any $r>0$ is equal to $a_0 + a_r + a_{2r} + \dots$ (here $a_0 = 0$)." is not true. The true version is $ a_r + a_{n+r} + a_{n+2r} + \ldots$, for which you can use the appropriate roots of unity filter (I don't know if you're aware of how to do this. The approach is reasonably direct from here.) – Calvin Lin Oct 25 '24 at 08:52
  • Since multiplication by $a\in(\mathbb Z/n\mathbb Z)^\times$ is a bijection between the solution sets for $r$ and $ar$, it is a priori clear that the number of solutions may only depend on $\gcd(n,r)$. – Emil Jeřábek Jun 17 '25 at 07:34

6 Answers

5

I believe this answer is much simpler than all the others.

Let $f(n,k,r)$ be the number of solutions of $x_1+x_2+\ldots+x_k\equiv r\pmod n$. We prove by induction on $k$ that $f(n,k,r)$ is the same for all $r\neq 0$.

Basis: we have $f(n,1,r)=1$ if $r\neq 0$ and $f(n,1,0)=0$. Induction step: for $k\geq 2$, suppose $r\neq 0$, and consider a solution $x_1,x_2,\ldots, x_k$. If the first $k-1$ terms sum to $0$, then $x_k=r$, which is allowed. If the first $k-1$ terms sum to $r$, then $x_k$ would have to be $0$, which is not allowed. In each of the other $n-2$ cases, there is exactly one valid choice of $x_k$. Since $f(n,k-1,r')$ equals $f(n,k-1,1)$ for all $r'\neq 0$ by the induction hypothesis, we get $f(n,k,r)=f(n,k-1,0)+(n-2)f(n,k-1,1)$. This does not depend on $r$, which completes the induction step and the proof.

Note: this gives us the recursive relations to compute the function: $$\begin{aligned} f(n,k,1)&=f(n,k-1,0)+(n-2)f(n,k-1,1),\\ f(n,k,0)&=(n-1)f(n,k-1,1). \end{aligned}$$ This yields $f(n,k,1)=\frac{(n-1)^k-(-1)^k}{n}$.
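As a sanity check, both recursive relations and the closed form can be verified by brute force; here is a minimal Python sketch (the names `f` and `closed_form` are mine):

```python
from itertools import product

def f(n, k, r):
    """Count k-tuples with entries in {1, ..., n-1} summing to r mod n."""
    return sum(1 for xs in product(range(1, n), repeat=k) if sum(xs) % n == r)

def closed_form(n, k):
    """The claimed value of f(n, k, r) for any r != 0."""
    return ((n - 1)**k - (-1)**k) // n

for n in range(2, 7):
    for k in range(2, 5):
        # every residue r != 0 gives the same count, matching the closed form
        assert {f(n, k, r) for r in range(1, n)} == {closed_form(n, k)}
        # the two recursive relations from the note
        assert f(n, k, 1) == f(n, k - 1, 0) + (n - 2) * f(n, k - 1, 1)
        assert f(n, k, 0) == (n - 1) * f(n, k - 1, 1)
```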

  • 1
    Very nice. I also came up with that approach, motivated by the counting the solutions through linear recurrence. As I've seen this problem before, I chose to focus on OP's approaches (bijection and roots of unity filter). – Calvin Lin Oct 25 '24 at 13:33
  • 1
    Very nice answer! I am marking the one that uses the roots of unity filter as accepted because this is what I had in mind initially. – George Giapitzakis Oct 25 '24 at 16:07
3

We establish the result by fixing the various errors in the OP's roots of unity filter argument. See this writeup for the more general version, which allows for $r \neq 0$.

Let $P(x) = x + x^2 + \dots + x^{n-1} $.
Let $ \omega$ be a primitive $n$-th root of unity.
Note that $P(1) = n-1$ and $P(\omega^r) = - 1$ for $r$ not a multiple of $n$.
Let $G(x)=P(x)^k = a_1x + a_2x^2 + \dots$.
Let $G_r = \sum_{i \geq 0} a_{in + r} $.
We want to show that $G_r$ is constant for $r \not\equiv 0 \pmod n$.

Applying the roots of unity filter for $r \neq 0$,

$$ \begin{aligned} G_r & = \frac{ \omega^{-r} G(\omega) + \omega^{-2r} G(\omega^2) + \ldots + \omega^{-(n-1)r} G(\omega^{n-1}) + \omega^{-nr}G(1) } { n } \\ & = \frac{(\omega^{-r} + \omega^{-2r} + \ldots + \omega^{-(n-1)r} )(-1)^k + G(1) } { n } \\ & = \frac{ (-1) ( -1)^k + (n-1)^k } { n} \end{aligned} $$

Since this expression is independent of $r$, the count is constant over all $r \not\equiv 0 \pmod n$.

In particular, for Mike's case where $n=4,k=3$, we have $G_r = \frac{1 + 3^3 } { 4 } = 7$, which agrees with his total count.

Bonus: Applying the roots of unity filter for $r= 0$,
$$ \begin{aligned} G_0 & = \frac{ G(\omega) + G(\omega^2) + \ldots + G(\omega^{n-1}) + G(1) } { n } \\ & = \frac{(n-1 )(-1)^k + G(1) } { n } \\ & = \frac{ (n-1) ( -1)^k + (n-1)^k } { n} \end{aligned} $$

Hence, $G_0 = G_1 + (-1)^k$.

In particular, they are never equal.
Note: If we simply assumed the result, then since $(n-1)^k = G_0 + (n-1) G_1$ and $n \nmid (n-1)^k$, we could have concluded that $G_0 \neq G_1$.
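The filter computations above can be verified numerically; here is a minimal Python sketch (the name `G_filter` is mine) that evaluates $\frac1n \sum_{j=1}^{n} \omega^{-jr} P(\omega^j)^k$ directly with complex arithmetic:

```python
import cmath

def G_filter(n, k, r):
    """Evaluate (1/n) * sum_{j=1}^{n} w^{-jr} P(w^j)^k with w = e^{2*pi*i/n}."""
    w = cmath.exp(2j * cmath.pi / n)
    P = lambda x: sum(x**i for i in range(1, n))   # P(x) = x + ... + x^{n-1}
    total = sum(w**(-j * r) * P(w**j)**k for j in range(1, n + 1))
    return round((total / n).real)                 # the sum is a (real) integer

n, k = 4, 3
assert G_filter(n, k, 1) == ((-1) * (-1)**k + (n - 1)**k) // n  # = 7, Mike's case
assert G_filter(n, k, 0) == G_filter(n, k, 1) + (-1)**k         # G_0 = G_1 + (-1)^k
```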

Calvin Lin
  • 77,541
2

You say that ideally you want to construct a bijection, so that is what I will do. Letting $X(n,k,r)$ be the set of solutions to $x_1+\dots+x_k\equiv r\pmod n$ where all variables are nonzero, I will give a bijection between $X(n,k,r)$ and $X(n,k,r')$ whenever $r,r'\in \{1,\dots,n-1\}$.

Consider the correspondence mapping the vector $(x_1,x_2,\dots,x_k)$ to its list of $k+1$ partial sums modulo $n$, where we include the empty partial sum at the beginning. $$ \begin{align} (x_1,x_2,\dots,x_k)&\mapsto (s_0,s_1,s_2,\dots,s_k),\text{ where} \\ s_0&=0, \\ s_i&\equiv x_1+\dots+x_i\pmod n\quad (1\le i\le k) \end{align} $$

Let $S(n,k,r)$ denote the image of $X(n,k,r)$ under this correspondence. Even though a bijection between $X(n,k,r)$ and $X(n,k,r')$ is hard to find directly, a bijection between $S(n,k,r)$ and $S(n,k,r')$ is simple to describe. Note that $S(n,k,r)$ consists of sequences of length $k+1$ starting with zero, in which adjacent entries are distinct, and whose last entry is $r$.

Given $(s_0,s_1,\dots,s_k)\in S(n,k,r)$, let $z$ be the greatest index for which $s_z=0$. We will modify only the entries $s_{z+1},\dots,s_{k}$ of the input, which are all nonzero. Follow these steps:

  • Let $t_j=s_j-1$ for each $j\in \{z+1,\dots,k\}$, and interpret $t_j$ as an element of $\mathbb Z/(n-1)\mathbb Z$.
  • Let $t'_j=t_j+(r'-r)\pmod{n-1}$, for each $j\in \{z+1,\dots,k\}$.
  • Finally, interpret $t_j'$ as an integer in $\{0,\dots,n-2\}$, and let $s_j'=t_j'+1$.

Then the vector $(s_0,\dots,s_z,s_{z+1}',\dots,s_k')$ is in $S(n,k,r')$.

Here is an example of the bijection in the case where $n=4,k=3$, $r=1$ and $r'=2$.

$$ \begin{array}{cccc} X(4,3,1) & S(4,3,1) & \text{Result of bijection}\in S(4,3,2) & X(4,3,2) \\\hline (1,1,3) & (0,1,2,1) & (0,\color{blue}{2,3,2}) & (2,1,3) \\(1,2,2) & (0,1,3,1) & (0,\color{blue}{2,1,2}) & (2,3,1) \\(1,3,1) & (0,1,0,1) & (0,1,0,\color{blue}2) & (1,3,2) \\(2,1,2) & (0,2,3,1) & (0,\color{blue}{3,1,2}) & (3,2,1) \\(2,2,1) & (0,2,0,1) & (0,2,0,\color{blue}{2}) & (2,2,2) \\(3,1,1) & (0,3,0,1) & (0,3,0,\color{blue}{2}) & (3,1,2) \\(3,3,3) & (0,3,2,1) & (0,\color{blue}{1,3,2}) & (1,2,3) \end{array} $$
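A direct implementation confirms that this map carries $X(n,k,r)$ onto $X(n,k,r')$; here is a minimal Python sketch (the names `X` and `remap` are mine):

```python
from itertools import product

def X(n, k, r):
    """All k-tuples with entries in {1, ..., n-1} summing to r mod n."""
    return [xs for xs in product(range(1, n), repeat=k) if sum(xs) % n == r]

def remap(xs, n, r, r_new):
    """Map a solution in X(n, k, r) to X(n, k, r_new) via shifted partial sums."""
    k = len(xs)
    s = [0]
    for x in xs:                       # partial sums s_0, ..., s_k mod n
        s.append((s[-1] + x) % n)
    z = max(i for i in range(k + 1) if s[i] == 0)    # greatest index with s_z = 0
    for j in range(z + 1, k + 1):      # shift the nonzero tail within {1, ..., n-1}
        s[j] = (s[j] - 1 + (r_new - r)) % (n - 1) + 1
    # recover x'_i as differences of the modified partial sums
    return tuple((s[i] - s[i - 1]) % n for i in range(1, k + 1))

n, k, r, r2 = 4, 3, 1, 2
assert sorted(remap(xs, n, r, r2) for xs in X(n, k, r)) == sorted(X(n, k, r2))
```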

Mike Earnest
  • 84,902
  • 1
    1/ Pointing out that we require $ z < k$ which is why the argument doesn't work for $r = 0 $ (and it shouldn't). $\quad$ 2/ Your map of $s_j \rightarrow s_j'$ is better (more simply) described as conditioning on $s_j \neq n-1$. – Calvin Lin Oct 25 '24 at 09:51
2

First, we establish the result by induction.
Note that we do require $ r, r' \neq 0$, especially since otherwise the statement might not be true, as I pointed out in the comments.

Clearly the base case of $k = 1$ is true: for any $r \neq 0$ there is exactly one solution, namely $x_1 = r$.

Now, we want to show that the number of ways to write $x_1 + x_2 + \ldots + x_{k+1} \equiv r$ is the same as the number of ways to write $x_1' + x_2' + \ldots + x_{k+1}' \equiv r'$. Condition on $x_{1}$.

  • If $x_{1} \not \equiv r - r'$, then we map $f(x_1, x_2, \ldots, x_{k+1} ) = (x_1 + (r'-r), x_2, \ldots, x_{k+1})$.
    • Note that we do not map to $x_1' = r'-r$.
    • This is a bijection between these cases, so the number of solutions is the same.
  • If $x_{1} \equiv r - r'$, then set $x_1' = r'-r$ (which we didn't map to above).
    • Via the induction hypothesis, the number of solutions to $x_2 + x_3 + \ldots + x_{k+1} \equiv r - (r-r') \equiv r' $ is the same as the number of solutions to $x_2^* + x_3^* + \ldots + x_{k+1} ^* \equiv r' - (r'-r) \equiv r $.

Hence the number of solutions is the same, and we are done.


Second, we completely describe the bijection, by considering what happens in the induced bijection.

  • If $x_{1} \neq r - r'$: then $x_1' = x_1 + r'-r$ and the rest stay the same.
  • If $x_1 = r-r', x_2 \neq r'-r$: then $x_1' = r'-r, x_2' = x_2 + r - r'$ and the rest stay the same.
  • If $x_1 = r-r', x_2 = r'-r, x_3 \neq r-r'$: then $x_1' = r'-r, x_2' = r-r', x_3' = x_3 + r'-r$ and the rest stay the same.
  • Continue in this manner through the last term; some term must eventually differ from the forbidden value, since $r, r' \neq 0$.

Here is an example of the bijection using Mike's case where $n=4,k=3$, $r=1$ and $r'=2$.

$$ \begin{array}{c l l } X(4,3,1) & X(4,3,2) & \\\hline (1,1,3) & (2,1,3) & \text{Add 1 to the 1st term} \\(1,2,2) & (2,2,2) & \text{Add 1 to the 1st term} \\(1,3,1) & (2,3,1) & \text{Add 1 to the 1st term} \\(2,1,2) & (3,1,2) & \text{Add 1 to the 1st term} \\(2,2,1) & (3,2,1) & \text{Add 1 to the 1st term} \\ (3,1,1) & (1,3,2) & \text{Set $x_1' = 1, x_2' = 3$, Add 1 to the 3rd term} \\ (3,3,3) & (1,2,3) & \text{Set $x_1' = 1$, subtract 1 from the 2nd term}\\ \end{array} $$
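This alternating shift can also be checked by code; here is a minimal Python sketch (the names `X` and `shift` are mine) of the case analysis above:

```python
from itertools import product

def X(n, k, r):
    """All k-tuples with entries in {1, ..., n-1} summing to r mod n."""
    return [xs for xs in product(range(1, n), repeat=k) if sum(xs) % n == r]

def shift(xs, n, r, r2):
    """Walk the terms, flipping 'stuck' values, until one can absorb the shift."""
    xs = list(xs)
    stuck = (r - r2) % n               # the value x_1 must avoid, i.e. r - r'
    for i in range(len(xs)):
        if xs[i] != stuck:
            xs[i] = (xs[i] - stuck) % n   # add r' - r (sign alternates via stuck)
            break
        xs[i] = (-stuck) % n           # x_i was stuck: set it to the opposite value
        stuck = (-stuck) % n           # and the forbidden value flips sign
    return tuple(xs)

n, k, r, r2 = 4, 3, 1, 2
assert sorted(shift(xs, n, r, r2) for xs in X(n, k, r)) == sorted(X(n, k, r2))
```

The loop always terminates with a `break`: if every term were stuck, the alternating values would sum to $0$ or $r - r'$ mod $n$, contradicting $r, r' \neq 0$.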

Calvin Lin
  • 77,541
0

If we count all solutions, including $x_i=0$, then the number is clearly $$n^{k-1}\tag1$$ as there is a unique solution for any choice of the first $k-1$ coordinates.

Then the number of solutions excluding $x_i=0$ is given by the inclusion–exclusion principle as $$\sum_{I\subseteq[k]}(-1)^{|I|}n^{k-|I|-1}=\frac1n\sum_{i=0}^k\binom ki(-1)^in^{k-i}=\frac{(n-1)^k}n.\tag2$$ Except that this can’t be right, as it is not even an integer. So what went wrong?

Well, (1) is only valid for $k>0$: for $k=0$, the number of solutions is $$\begin{cases}1,&r=0,\\0,&r\ne0\end{cases}$$ rather than “$1/n$”. This changes the $I=[k]$ term in (2)—and consequently the end result—by $\pm1/n$ (for $r\ne0$) or $\pm(1-1/n)$ (for $r=0$), with sign depending on the parity of $k$. The actual result is thus $$\begin{cases}\dfrac{(n-1)^k-(-1)^k}n,&r\ne0,\\[1.5ex] \dfrac{(n-1)^k+(-1)^k(n-1)}n,&r=0.\end{cases}\tag3$$
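The corrected formula (3) can be confirmed by exhaustive enumeration, including the $k=0$ edge case; here is a minimal Python sketch (the names `count` and `formula` are mine):

```python
from itertools import product

def count(n, k, r):
    """Brute-force count of k-tuples in {1, ..., n-1}^k summing to r mod n."""
    return sum(1 for xs in product(range(1, n), repeat=k) if sum(xs) % n == r)

def formula(n, k, r):
    """The two cases of (3)."""
    if r % n:
        return ((n - 1)**k - (-1)**k) // n
    return ((n - 1)**k + (-1)**k * (n - 1)) // n

for n in range(2, 6):
    for k in range(5):                 # k = 0 included: the edge case discussed above
        for r in range(n):
            assert count(n, k, r) == formula(n, k, r)
```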

-1

$x_1, \dots, x_{k-1}$ can be anything; then simply set $x_k$ to the value that brings the sum to $r$ modulo $n$. The logic applies whether $r$ is zero or any other number. Therefore the number of solutions is the same regardless of $r$, as long as $0 \le r < n$ of course.

EDIT: This assumed $0 \le x_i < n$ instead of $0 < x_i < n$; never mind this answer.

Snared
  • 1,068
  • I'm not sure I follow the logic. How does this prove that the number of solutions is the same (including $r=0$)? Actually, the number of solutions I got for $r=0$ is different from the rest. – George Giapitzakis Oct 25 '24 at 02:20
  • The proof shows the count is $n^{k-1}$, regardless of $r$, no? – Snared Oct 25 '24 at 02:31
  • 2
    well okay, didn't realize you excluded zero, well that changes things, my bad. I thought it was $0 \le x_i < n$, nevermind my answer. – Snared Oct 25 '24 at 02:32