
I am aware of this question:

$\gcd\left(a+b,\frac{a^p+b^p}{a+b}\right)=1$, or $p$

However, that question does not ask for a proof that the GCD must be $p$, so answers to it would not necessarily fit here, and vice versa. Moreover, I found the existing solutions to related older questions confusing and unclear, which is why I decided to make my own post for this question. To be explicit, the claim at issue is: for coprime positive integers $a, b$ and a prime $n$, $\gcd\left(\frac{a^n+b^n}{a+b},\, a+b\right) = 1$ unless $n \mid a+b$.

2 Answers


For any positive integer $n$,

$$ a^n + b^n = a^n + (( a+b) + (-a))^n = a^n(1 + (-1)^n) + (a + b)^n + \sum_{j = 1}^{n-1} {n \choose j} (a + b)^j (-a)^{n-j}$$
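As a quick sanity check (not part of the proof), the rearrangement above can be verified symbolically for a few concrete exponents; a minimal sketch, assuming sympy is available:

```python
# Symbolic spot-check of the rearrangement
#   a^n + b^n = a^n (1 + (-1)^n) + (a+b)^n + sum_{j=1}^{n-1} C(n,j) (a+b)^j (-a)^(n-j)
# for a few concrete values of n.
import sympy as sp

a, b = sp.symbols('a b')
for n in [2, 3, 4, 5, 7]:
    rhs = (a**n * (1 + (-1)**n) + (a + b)**n
           + sum(sp.binomial(n, j) * (a + b)**j * (-a)**(n - j)
                 for j in range(1, n)))
    assert sp.expand(a**n + b**n - rhs) == 0  # identity holds for these n
```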

CASE I: $n$ is an odd prime.

The $a^n$ term vanishes as $n$ is odd. $$ a^n + b^n = (a + b) \left[ (a + b)^{n-1} + \sum_{j=1}^{n-1}{n \choose j}(a + b)^{j-1}(-a)^{n-j}\right]$$ Dividing throughout by $a + b$,

$$ \frac{a^n + b^n}{a+b} = (a + b)^{n-1} + {n \choose 1}(-a)^{n-1} + \sum_{j=2}^{n-1}{n \choose j}(a+b)^{j-1}(-a)^{n-j}$$

Since $n-1$ is even and ${n \choose 1} = n$, the above expression can be simplified to:

$$ \frac{a^n + b^n}{ a + b } = na^{n-1} + (a + b)\left[ (a+b)^{n-2} + \sum_{j=2}^{n-1} {n \choose j}(a + b)^{j-2}(-a)^{n-j} \right]$$

Let us denote $Q_n = \left[ (a+b)^{n-2} + \sum_{j=2}^{n-1} {n \choose j}(a + b)^{j-2}(-a)^{n-j} \right]$ for simplicity. As a sum of integers, $Q_n \in \mathbb{Z}$ as well.

Thus we get the result

\begin{equation} \frac{a^n + b^n}{a + b} = na^{n-1} + Q_n(a+b) \end{equation}

where it should be noted that $n$ is an odd prime.
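As an aside, this identity is easy to spot-check numerically; an illustrative sketch (plain Python integers, a few small odd primes):

```python
# Spot-check: for odd prime n, (a^n + b^n)/(a+b) is an integer congruent
# to n * a^(n-1) modulo a+b, i.e. Q_n = (q - n*a^(n-1))/(a+b) is an integer.
for n in [3, 5, 7, 11]:                          # odd primes
    for a in range(1, 20):
        for b in range(1, 20):
            assert (a**n + b**n) % (a + b) == 0  # a+b divides a^n+b^n (n odd)
            q = (a**n + b**n) // (a + b)
            assert (q - n * a**(n - 1)) % (a + b) == 0
```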

With this result in hand, we have

$$ \left(\frac{a^n + b^n}{a + b}, a+b \right) = \left(na^{n-1} + Q_n(a+b), a+b \right) = \left(na^{n-1}, a+b \right)$$

using the fact that $(x, m) = (x + km, m)$ for any integer $k$.
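This standard gcd identity can itself be spot-checked numerically; a tiny illustrative sketch using Python's `math.gcd`:

```python
from math import gcd

# gcd(x, m) is unchanged by adding any integer multiple of m to x
# (math.gcd treats negative inputs by absolute value, matching the
#  usual convention for gcds).
for x in range(1, 40):
    for m in range(1, 40):
        for k in range(-3, 4):
            assert gcd(x + k * m, m) == gcd(x, m)
```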

Denote $\left(na^{n-1}, a+b \right) = d$. Assume $d \neq 1$. Then $d$ must have a prime factor, say, $p$. Since $p | d$ and by definition $d | a+b$ and $d | na^{n-1}$, we conclude that $ p | a + b$ and $ p | na^{n-1}$. The latter expression implies that $p|n$ or $p|a$ (or both), since $p$ is prime.

If $p | a$, then as $p | a + b$, $p | b$ as well, which is not possible given $(a, b) = 1$. Hence $p \nmid a$.

Thus, $p | n$, and as both $p$ and $n$ are prime, $p = n$. Hence, $n | a+b$.

Thus we have found that

$$d \neq 1 \implies n | a+b$$

This is logically equivalent to saying that $d = 1$ unless $n | a+b$.
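Empirically, this is consistent with the dichotomy in the linked question (that the gcd is $1$ or exactly $n$); an illustrative brute-force sketch:

```python
from math import gcd

# Spot-check: for odd prime n and coprime a, b, the gcd d is 1 when
# n does not divide a+b, and equals n when it does.
for n in [3, 5, 7]:
    for a in range(1, 30):
        for b in range(1, 30):
            if gcd(a, b) != 1:
                continue
            d = gcd((a**n + b**n) // (a + b), a + b)
            assert d == (n if (a + b) % n == 0 else 1)
```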

CASE II: $n = 2$, that is, $n$ is an even prime.

The argument here follows similar lines as above.

$$ \frac{a^2 + b^2}{ a + b } = \frac{(a+b)^2 - 2ab}{a+b} = (a+b) - \frac{2ab}{a+b}$$

Note that $\frac{2ab}{a+b}$ is an integer precisely when $\frac{a^2+b^2}{a+b}$ is, since their sum is the integer $a+b$. As before (and since a sign change does not affect the gcd), we can write $$ \left(\frac{a^2 + b^2}{a+b}, a + b\right) = \left((a + b) - \frac{2ab}{a+b}, a + b\right) = \left(\frac{2ab}{a+b}, a + b\right)$$

Let us denote $\left(\frac{2ab}{a+b}, a + b\right) = e$. Assume that $e \neq 1$. Then there exists a prime factor of $e$, denoted by $q$.

As before, we can conclude that the above statements imply $q | a+b$ and $q | \frac{2ab}{a + b}$. This is equivalent to saying that there exist integers $\alpha, \beta \in \mathbb{Z}$ such that $$ a+b = \alpha q$$ and $$ \frac{2ab}{a + b} = \beta q $$

Multiplying these gives $2ab = \alpha\beta q^2$, so $q^2 \mid 2ab$ and in particular $q \mid 2ab$. Since $q$ is prime, $q \mid 2$ or $q \mid ab$ (or both). If $q \mid ab$, then $q \mid a$ or $q \mid b$; and since $q \mid a+b$, either one implies the other, so $q$ would divide both $a$ and $b$, which is impossible as $(a, b) = 1$. Hence $q \nmid ab$, so $q \mid 2$, and as $q$ is prime, we must have $q = 2$. We can thus conclude that $$ e \neq 1 \implies 2 \mid a+b$$

This is logically equivalent to saying that $e = 1$ unless $2 | a+b$.

Hence we are done. $\blacksquare$
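For what it's worth, Case II can also be brute-forced; a small illustrative search (among coprime pairs, it only ever finds $a = b = 1$ with an integer quotient, in line with the other answer below):

```python
from math import gcd

# Brute-force Case II: among coprime pairs (a, b), find those where
# (a^2 + b^2)/(a + b) is an integer, and inspect the gcd e.
hits = [(a, b) for a in range(1, 60) for b in range(1, 60)
        if gcd(a, b) == 1 and (a*a + b*b) % (a + b) == 0]
print(hits)                      # -> [(1, 1)] only
for a, b in hits:
    e = gcd((a*a + b*b) // (a + b), a + b)
    print(a, b, e)               # -> 1 1 1, i.e. e = gcd(1, 2) = 1
```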


Here's a relatively standard solution using LTE (the lifting-the-exponent lemma). Suppose $p$ is a prime that divides $a+b$; in particular, if $n\nmid a+b$, then $p\neq n$. Notice that since $a,b$ are coprime and $p\mid a+b$, we have $p\nmid a,b$. Therefore, by LTE, $$v_p\left(\frac{a^n+b^n}{a+b}\right)=v_p(a^n+b^n)-v_p(a+b)=v_p(n)=0.$$ (For $p=2$, where this form of LTE requires $n$ odd, note that with $a,b$ odd and $n$ odd the cofactor $a^{n-1}-a^{n-2}b+\cdots+b^{n-1}$ is a sum of $n$ odd terms, hence odd, so $v_2(a^n+b^n)=v_2(a+b)$ still holds.) This implies that no prime dividing $a+b$ divides $\frac{a^n+b^n}{a+b}$, i.e. they don't share any factors.

If $n\mid a+b$ and $n\neq 2$, then $$v_n\left(\frac{a^n+b^n}{a+b}\right)=v_n(a^n+b^n)-v_n(a+b)=v_n(n)=1,$$ and so $n$ divides both $a+b$ and $\frac{a^n+b^n}{a+b}$, showing that they have a common factor.

If $n=2$ and $n\mid a+b$, then the statement is actually false (if you understand it as: every time $n\mid a+b$, they have a common factor), since taking $a=b=1$ and $n=2$ we notice that $1$ and $2$ don't share any common factors. It actually fails even more miserably. If $2\mid a+b$ and $a,b$ are coprime, then both $a,b$ are odd. As discussed in the other solution, we have $$\gcd\left(\frac{a^2+b^2}{a+b},a+b\right)=\gcd\left(\frac{2ab}{a+b},a+b\right)$$

Then, for any prime $p\neq 2$ that divides $a+b$, we must have $p\nmid a,b$: since $a,b$ are coprime, $p$ can divide at most one of them, and if it divided exactly one, it would not divide the sum. Now $v_p(2ab)=0$, so $a+b$ cannot divide $2ab$ if it has an odd prime factor. Thus the only way the fraction is an integer is if $a+b$ is a power of $2$; but in that case, since both $a,b$ are odd, $v_2(2ab)=1$, so the only power of $2$ that can divide $2ab$ is $2$ itself. So when $n=2$ the only case where the fraction is an integer is $a=b=1$, which we already saw fails the statement (interpreted as an iff).
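The valuation computations above are easy to spot-check with sympy's `multiplicity`; an illustrative sketch:

```python
import sympy as sp

# Check v_p((a^n + b^n)/(a+b)) = v_p(n) for every prime p | a+b,
# with a, b coprime and n an odd prime (v_p(n) is 1 if p == n, else 0).
for n in [3, 5, 7]:
    for a in range(1, 25):
        for b in range(1, 25):
            if sp.gcd(a, b) != 1:
                continue
            q = (a**n + b**n) // (a + b)
            for p in sp.primefactors(a + b):
                assert sp.multiplicity(p, q) == (1 if p == n else 0)
```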