I've found a basically "computation-free"/"elbow-grease-free" proof of the Newton inequalities, which in particular imply the Maclaurin inequalities (cut-the-knot link), a direct generalization of AM-GM. See the Appendix for definitions/notation! I wrote it a bit verbosely, but once you get the 2 conceptual ingredients, I think it's super easy to remember.
Newton's inequality. Fix a set $X=X_n=\left\{x_1, x_2, \ldots, x_n\right\}$ of non-negative numbers; then, Newton's inequality (for $n$ variables) is
$$\boxed{ \left(E_k(X)\right)^2\geq E_{k-1}(X) E_{k+1}(X), \qquad 1 \leq k \leq n-1 }$$
which is moreover strict if all the $x_i$ are distinct.
Proof (a simplification of this AoPS proof of Newton's inequality): first, basically by definition, the $k$th symmetric average $E_k(x_1,\ldots, x_n)$ is the $t^{n-k}$ coefficient of the polynomial $p(t):=(t+x_1)\cdots (t+x_n)$ --- denoted $e_k(x_1,\ldots, x_n)$, the $k$th symmetric sum --- divided by $\binom nk$.
Let us abuse notation and also use $E_k(p), e_k(p)$ to denote the $k$th symmetric average/sum of the roots of the (monic real-rooted) polynomial $p$.
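(If you want to play with these objects numerically, here's a minimal standalone Python sketch; the helper names `e_sym`/`E_avg` are mine, not standard. It computes $e_k$ and $E_k$ straight from the definition and sanity-checks the coefficient identity for $p(t)$:)

```python
import math
from itertools import combinations

def e_sym(xs, k):
    """k-th elementary symmetric sum e_k(x_1, ..., x_n)."""
    return sum(math.prod(c) for c in combinations(xs, k))

def E_avg(xs, k):
    """k-th symmetric average E_k = e_k / C(n, k)."""
    return e_sym(xs, k) / math.comb(len(xs), k)

# Check: e_k(x) is the t^(n-k) coefficient of p(t) = (t + x_1)...(t + x_n).
xs = [1.0, 2.0, 5.0]
coeffs = [1.0]                      # coefficients of p, leading term first
for x in xs:                        # multiply in one factor (t + x) at a time
    coeffs = [a + x * b for a, b in zip(coeffs + [0.0], [0.0] + coeffs)]
for k in range(len(xs) + 1):
    assert abs(coeffs[k] - e_sym(xs, k)) < 1e-9
```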
Observation 1 (from AoPS link): by the power rule for derivatives ($p'$ is real-rooted by Rolle's theorem, or, if you want to be fancy, Gauss-Lucas; and $\frac 1n p'$ is monic), $e_k(\frac 1n p') = \frac 1n (n-k) e_k(p)$ for $0\leq k \leq n-1$. Therefore, using $\binom{n-1}{k}=\frac{n-k}{n}\binom nk$,
$$E_k(p') = \frac{e_k(p')}{\binom{n-1}k}= \frac{\frac 1n(n-k) e_k(p)}{\frac{n-k}{n}\cdot \binom{n}k} = E_k(p).$$
So amazingly, derivatives preserve symmetric averages!
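Here's a quick numerical sanity check of Observation 1 (a standalone Python sketch; finding the roots of $p'$ with numpy is an implementation choice, not part of the proof):

```python
import math
import numpy as np
from itertools import combinations

def E_avg(xs, k):
    """E_k = e_k / C(n, k), as in the Appendix."""
    return sum(math.prod(c) for c in combinations(xs, k)) / math.comb(len(xs), k)

xs = [0.5, 1.0, 3.0, 7.0]
n = len(xs)
p = np.poly([-x for x in xs])       # coefficients of p(t) = (t + x_1)...(t + x_n)
dp = np.polyder(p)                  # p'(t); real-rooted by Rolle, with n-1 roots
ys = [(-r).real for r in np.roots(dp)]   # p' has roots -y_i with y_i >= 0
for k in range(n):                  # Observation 1: E_k(p') = E_k(p), 0 <= k <= n-1
    assert abs(E_avg(ys, k) - E_avg(xs, k)) < 1e-8
```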
We're tantalizingly close; assuming (by induction) the Newton inequalities for $n-1$ variables (applied to the $n-1$ roots of $p'$, which covers $1 \leq k \leq n-2$), the above "differentiation reduction" proves the Newton inequalities (for $n$ variables) for $1 \leq k \leq n-2$.
I.e. only the case $k=n-1$ remains. If one of the $x_i$ is $0$, the $k=n-1$ inequality is trivially true (since the RHS of the $k=n-1$ inequality would have a factor of $E_n(X)=0$), so let's assume all $x_i\neq 0$.
Observation 2: consider the reciprocal roots $\frac{1}{\vec x}:= \frac 1{x_1},\ldots, \frac{1}{x_n}$ instead of $\vec x:= x_1,\ldots, x_n$. It is easy to see that
$$E_k(\frac 1{x_1},\ldots, \frac{1}{x_n})\cdot \underbrace{E_n(x_1,\ldots, x_n)}_{=x_1\cdots x_n}= E_{n-k}(x_1,\ldots, x_n).$$
(The same is true for $e$ instead of $E$.) This implies that any homogeneous inequality involving $E_k(\vec x)$ (or $e_k(\vec x)$) is EQUIVALENT to the same homogeneous inequality with $k \mapsto n-k$ and $\vec x \mapsto \frac 1{\vec x}$. In particular,
$$E_k(\vec x)^2 \geq E_{k-1}(\vec x)E_{k+1}(\vec x) \iff E_{n-k}(\frac1{\vec x})^2 \geq E_{n-k+1}(\frac1{\vec x})E_{n-k-1}(\frac1{\vec x}).$$
So indeed the final $k=n-1$ case follows from the $k=1$ case applied to the reciprocal roots $\frac1{\vec x}$, and $k=1$ is already covered: recall the differentiation reduction above handled $1 \leq k \leq n-2$, which includes $k=1$ whenever $n \geq 3$.
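Observation 2 is also easy to sanity-check numerically (same standalone `E_avg` helper as in the sketches above):

```python
import math
from itertools import combinations

def E_avg(xs, k):
    """E_k = e_k / C(n, k), as in the Appendix."""
    return sum(math.prod(c) for c in combinations(xs, k)) / math.comb(len(xs), k)

xs = [0.5, 1.0, 3.0, 7.0]           # all nonzero, as in the k = n-1 case
n = len(xs)
recip = [1.0 / x for x in xs]
for k in range(n + 1):
    # E_k(1/x) * E_n(x) = E_{n-k}(x), where E_n(x) = x_1 * ... * x_n
    assert abs(E_avg(recip, k) * E_avg(xs, n) - E_avg(xs, n - k)) < 1e-9
```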
Amusingly, this induction is basically identical to the induction in the famous "All horses are the same color" pseudoproof. To make ours legitimate, we actually need the base case to be $n=2$.
But of course, this is the basic 2-variable AM-GM (for $x_1,x_2 \geq 0$)!
$$\left(\frac{x_1+x_2}2\right)^2=:E_1(x_1,x_2)^2\geq E_0(x_1,x_2)E_2(x_1,x_2) := 1 \cdot x_1x_2,$$
which is strict if $x_1 \neq x_2$.
How amazing! The only algebra we need is 2-variable AM-GM (basically trivial), Observation 1 (derivative power rule, and 1 small manipulation of binomial coefficients), and Observation 2 (basically trivial), and we magically, "effortlessly" bootstrap ourselves all the way up to the Newton inequalities, which are vastly more powerful than $n$-variable AM-GM!
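As a final sanity check of the full statement (not a proof, of course!), here's a standalone Python sketch testing Newton's inequalities on random non-negative inputs; the draws are distinct with probability 1, so the strict version should hold:

```python
import math
import random
from itertools import combinations

def E_avg(xs, k):
    """E_k = e_k / C(n, k), as in the Appendix."""
    return sum(math.prod(c) for c in combinations(xs, k)) / math.comb(len(xs), k)

random.seed(0)
for _ in range(100):
    n = random.randint(2, 7)
    xs = [random.uniform(0.0, 10.0) for _ in range(n)]
    for k in range(1, n):           # Newton: E_k^2 > E_{k-1} E_{k+1}
        assert E_avg(xs, k) ** 2 > E_avg(xs, k - 1) * E_avg(xs, k + 1)
```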
Appendix:
For the sake of completeness, I'll quote the relevant parts of the above cut-the-knot link:
Defining symmetric averages
For a set $X_n=\left\{x_1, x_2, \ldots, x_n\right\}$ of $n$ variables, denote by $X_n^k$ the set of all products of the $n$ variables taken $k$ at a time. There are $\binom{n}{k}$ such products:
$$ \left|X_n^k\right|=\binom{n}{k} $$
The symmetric functions in $n$ variables are defined as
$$ \begin{aligned} e_0\left(x_1, x_2, \ldots, x_n\right) & =1 \\
e_1\left(x_1, x_2, \ldots, x_n\right) & =\sum_{t \in X_n^1}
t=x_1+x_2+\ldots+x_n \\ e_2\left(x_1, x_2, \ldots, x_n\right) &=\sum_{t \in X_n^2} t=x_1 x_2+x_1 x_3+\ldots+x_{n-1} x_n \\ & \cdots \cdots \\ e_n\left(x_1, x_2, \ldots, x_n\right) & =\sum_{t \in X_n^n} t=x_1 x_2 \cdots x_n \end{aligned} $$
and their averages as
$$ \begin{aligned} & E_0(X)=E_0\left(x_1, x_2, \ldots, x_n\right)=\frac{1}{\binom{n}{0}} e_0\left(x_1, x_2, \ldots, x_n\right), \\ & E_1(X)=E_1\left(x_1, x_2, \ldots, x_n\right)=\frac{1}{\binom{n}{1}} e_1\left(x_1, x_2, \ldots, x_n\right), \\ & E_2(X)=E_2\left(x_1, x_2, \ldots, x_n\right)=\frac{1}{\binom{n}{2}} e_2\left(x_1, x_2, \ldots, x_n\right), \\ & \cdots \cdots \\ & E_n(X)=E_n\left(x_1, x_2, \ldots, x_n\right)=\frac{1}{\binom{n}{n}} e_n\left(x_1, x_2, \ldots, x_n\right) . \end{aligned} $$
Statement of Newton and Maclaurin
To formulate both Newton's inequality and Maclaurin's inequality we assume that all $x_i$ ($i=1,\ldots, n$) are non-negative.
Newton's inequality
$$\boxed{ \left(E_k(X)\right)^2\geq E_{k-1}(X) E_{k+1}(X), \qquad 1 \leq k \leq n-1 }$$
Maclaurin's inequality
$$
\boxed{E_1(X)\geq\left(E_2(X)\right)^{\frac{1}{2}}\geq\left(E_3(X)\right)^{\frac{1}{3}}\geq\ldots\geq\left(E_n(X)\right)^{\frac{1}{n}}}
$$
The inequalities become equalities only when all $x_i$ are equal. Note that $E_1(X)\geq\left(E_n(X)\right)^{\frac{1}{n}}$ is none other than the better-known AM-GM inequality.
Newton $\implies$ Maclaurin
Indeed, raise the $j$th Newton inequality $E_j^2 \geq E_{j-1}E_{j+1}$ to the $j$th power and multiply over $j=1,\ldots,k$:
$$ \left(E_0 E_2\right)\left(E_1 E_3\right)^2\left(E_2 E_4\right)^3 \cdots\left(E_{k-1} E_{k+1}\right)^k \leq \left(E_1\right)^2\left(E_2\right)^4 \cdots\left(E_k\right)^{2 k}, $$
which reduces (after cancellation, using $E_0=1$) to $\left(E_k\right)^{k+1}\geq\left(E_{k+1}\right)^k$, or
$\left(E_k\right)^{\frac{1}{k}}\geq\left(E_{k+1}\right)^{\frac{1}{k+1}}$.
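And the resulting Maclaurin chain is just as easy to sanity-check numerically (same standalone helper as in the sketches above):

```python
import math
from itertools import combinations

def E_avg(xs, k):
    """E_k = e_k / C(n, k), as in the Appendix."""
    return sum(math.prod(c) for c in combinations(xs, k)) / math.comb(len(xs), k)

xs = [0.3, 1.0, 2.5, 4.0, 9.0]      # distinct, so the chain is strict
n = len(xs)
means = [E_avg(xs, k) ** (1.0 / k) for k in range(1, n + 1)]
# E_1 > E_2^(1/2) > ... > E_n^(1/n)
assert all(a > b for a, b in zip(means, means[1:]))
```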