For any positive integer $n$,
$$ a^n + b^n = a^n + (( a+b) + (-a))^n = a^n(1 + (-1)^n) + (a + b)^n + \sum_{j = 1}^{n-1} {n \choose j} (a + b)^j (-a)^{n-j}$$
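As a quick illustrative check of this identity (not needed for the argument itself), take $n = 3$, so that the $a^n(1 + (-1)^n)$ term drops out:
$$ a^3 + b^3 = (a+b)^3 + {3 \choose 1}(a+b)(-a)^2 + {3 \choose 2}(a+b)^2(-a) = (a+b)^3 + 3a^2(a+b) - 3a(a+b)^2 $$
Expanding the right-hand side indeed returns $a^3 + b^3$; for instance, at $a = b = 1$ both sides equal $2$.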
CASE I: $n$ is an odd prime.
Since $n$ is odd, $(-1)^n = -1$, so the $a^n(1 + (-1)^n)$ term vanishes and every surviving term carries a factor of $a + b$:
$$ a^n + b^n = (a + b) \left[ (a + b)^{n-1} + \sum_{j=1}^{n-1}{n \choose j}(a + b)^{j-1}(-a)^{n-j}\right]$$
Dividing throughout by $a + b$ and separating out the $j = 1$ term,
$$ \frac{a^n + b^n}{a+b} = (a + b)^{n-1} + {n \choose 1}(-a)^{n-1} + \sum_{j=2}^{n-1}{n \choose j}(a+b)^{j-1}(-a)^{n-j}$$
Since $n-1$ is even, $(-a)^{n-1} = a^{n-1}$; combining this with ${n \choose 1} = n$ and factoring $a + b$ out of the remaining terms, the above simplifies to:
$$ \frac{a^n + b^n}{ a + b } = na^{n-1} + (a + b)\left[ (a+b)^{n-2} + \sum_{j=2}^{n-1} {n \choose j}(a + b)^{j-2}(-a)^{n-j} \right]$$
For brevity, denote $Q_n = (a+b)^{n-2} + \sum_{j=2}^{n-1} {n \choose j}(a + b)^{j-2}(-a)^{n-j}$. As a sum of products of integers, $Q_n \in \mathbb{Z}$.
Thus we get the result
\begin{equation}
\frac{a^n + b^n}{a + b} = na^{n-1} + Q_n(a+b)
\end{equation}
where it should be kept in mind that $n$ is an odd prime.
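To see this result in action, take $n = 3$. Then $Q_3 = (a+b) + {3 \choose 2}(-a) = b - 2a$, and the formula gives
$$ \frac{a^3 + b^3}{a + b} = 3a^2 + (b - 2a)(a + b) = a^2 - ab + b^2, $$
in agreement with the familiar factorization $a^3 + b^3 = (a+b)(a^2 - ab + b^2)$.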
With this result in hand, we have
$$ \left(\frac{a^n + b^n}{a + b}, a+b \right) = \left(na^{n-1} + Q_n(a+b), a+b \right) = \left(na^{n-1}, a+b \right)$$
since $(u + vx, v) = (u, v)$ for any integers $u, v, x$; here $u = na^{n-1}$, $v = a + b$, and $x = Q_n$.
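For a concrete instance, take $n = 3$, $a = 2$, $b = 3$: then $\frac{a^3 + b^3}{a + b} = \frac{35}{5} = 7$ and $na^{n-1} = 12$, and indeed $(7, 5) = (12, 5) = 1$.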
Denote $\left(na^{n-1}, a+b \right) = d$. Assume $d \neq 1$. Then $d$ must have a prime factor, say, $p$. Since $p | d$ and by definition $d | a+b$ and $d | na^{n-1}$, we conclude that $ p | a + b$ and $ p | na^{n-1}$.
Since $p$ is prime, the latter gives $p | n$ or $p | a^{n-1}$; and $p | a^{n-1}$ in turn forces $p | a$.
If $p | a$, then as $p | a + b$, $p | b$ as well, which is not possible given $(a, b) = 1$. Hence $p \nmid a$.
Therefore $p | n$, and as both $p$ and $n$ are prime, $p = n$. Combined with $p | a + b$, this gives $n | a+b$.
Thus we have found that
$$d \neq 1 \implies n | a+b$$
This is logically equivalent to saying that $d = 1$ unless $n | a+b$.
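This case does occur: for $n = 3$, $a = 1$, $b = 2$ we have $n | a + b = 3$, and correspondingly $\frac{a^3 + b^3}{a + b} = \frac{9}{3} = 3$ with $d = (3, 3) = 3 \neq 1$.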
CASE II: $n = 2$, the only even prime.
The argument follows the same lines as above; note that the manipulations below treat $\frac{2ab}{a+b}$ as an integer, which is implicit in the gcd notation used in this case.
$$ \frac{a^2 + b^2}{ a + b } = \frac{(a+b)^2 - 2ab}{a+b} = (a+b) - \frac{2ab}{a+b}$$
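As a quick arithmetic check (this identity holds over the rationals whether or not $a + b$ divides $2ab$), take $a = 1$, $b = 3$:
$$ \frac{a^2 + b^2}{a + b} = \frac{10}{4} = \frac{5}{2} = 4 - \frac{6}{4} = (a+b) - \frac{2ab}{a+b} $$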
As before, using $(u + vx, v) = (u, v)$ together with $(-u, v) = (u, v)$, we can write $$ \left(\frac{a^2 + b^2}{a+b}, a + b\right) = \left((a + b) - \frac{2ab}{a+b}, a + b\right) = \left(\frac{2ab}{a+b}, a + b\right)$$
Let us denote $\left(\frac{2ab}{a+b}, a + b\right) = e$. Assume that $e \neq 1$. Then there exists a prime factor of $e$, denoted by $q$.
As before, since $q | e$ while $e | a+b$ and $e | \frac{2ab}{a + b}$ by definition, we get $q | a+b$ and $q | \frac{2ab}{a + b}$. Equivalently, there exist $\alpha, \beta \in \mathbb{Z}$ such that $$ a+b = \alpha q$$ and $$ \frac{2ab}{a + b} = \beta q $$
Multiplying these two equations gives $2ab = \alpha \beta q^2$, so $q^2 | 2ab$ and in particular $q | 2ab$. Since $q$ is prime, $q | 2$ or $q | ab$. If $q | ab$, then $q | a$ or $q | b$; in either case $q | a+b$ forces $q$ to divide both $a$ and $b$, which is impossible as $(a, b) = 1$. Hence $q \nmid ab$, so $q | 2$, and as $q$ is prime, $q = 2$. We thus conclude that $$ e \neq 1 \implies 2 | a+b$$
This is logically equivalent to saying that $e = 1$ unless $2 | a+b$.
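As a one-line check, note that the converse need not hold here: for $a = b = 1$ we have $2 | a + b$, yet $\frac{2ab}{a+b} = 1$ and $e = (1, 2) = 1$.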
Hence we are done. $\blacksquare$