Let $(g_{ij})$ denote the usual metric on $\mathbb{R}^{n+1}$. Consider the parameterization of $\mathbb{R}^{n+1}\setminus\{0\}$ given by $\vec{x}:\mathbb{R}^+\times\mathbb{S}^n\to\mathbb{R}^{n+1}$, $\vec{x}(r,\sigma)=r\sigma$. In these polar coordinates, we have that
$$(g_{ij})=\renewcommand{\arraystretch}{1.5}
\left(\begin{array}{@{}c|c@{}} \,\,1 &
\begin{matrix} 0 & \cdots & 0 \end{matrix} \\ \hline
\begin{matrix} 0 \\ \vdots \\ 0 \end{matrix} &
\begin{matrix} r^2\left(g_{ij}^{\mathbb{S}^n}\right) \end{matrix}
\end{array}\right),$$
where $\left(g_{ij}^{\mathbb{S}^n}\right)$ is the round metric on $\mathbb{S}^{n}\subseteq\mathbb{R}^{n+1}$, i.e. the one induced by the Euclidean metric. You should verify this if you haven't before (a quick sketch follows the inverse below)! Note that the inverse matrix is given by
$$(g^{ij})=\renewcommand{\arraystretch}{1.5}
\left(\begin{array}{@{}c|c@{}} \,\,1 &
\begin{matrix} 0 & \cdots & 0 \end{matrix} \\ \hline
\begin{matrix} 0 \\ \vdots \\ 0 \end{matrix} &
\begin{matrix} \frac{1}{r^2}\left(g^{ij}_{\mathbb{S}^n}\right) \end{matrix}
\end{array}\right),$$
where $\left(g^{ij}_{\mathbb{S}^n}\right)$ is the inverse of the round metric on $\mathbb{S}^n$.
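To sketch the verification promised above (I write $\theta^1,\dots,\theta^n$ for local coordinates on $\mathbb{S}^n$; these labels are just for illustration): from $\vec{x}(r,\sigma)=r\sigma$ we get $\partial_r\vec{x}=\sigma$ and $\partial_{\theta^i}\vec{x}=r\,\partial_{\theta^i}\sigma$, and differentiating $\langle\sigma,\sigma\rangle=1$ gives $\langle\partial_{\theta^i}\sigma,\sigma\rangle=0$. Hence
$$g_{rr}=\langle\sigma,\sigma\rangle=1,\qquad g_{r\theta^i}=\langle\sigma,\,r\,\partial_{\theta^i}\sigma\rangle=0,\qquad g_{\theta^i\theta^j}=r^2\bigl\langle\partial_{\theta^i}\sigma,\partial_{\theta^j}\sigma\bigr\rangle=r^2\,g^{\mathbb{S}^n}_{ij}.$$
The formula for $(g^{ij})$ then follows because a block-diagonal matrix inverts block by block.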
Now recall that the gradient is defined as $\nabla f = (df)^\sharp$; that is, $\nabla f$ is the unique vector field satisfying $g(\nabla f, X)=df(X)$ for all vector fields $X$. In local coordinates this reads $$\nabla f = g^{ij}\frac{\partial f}{\partial x^j}\frac{\partial}{\partial x^i},$$
where I use the Einstein summation convention; indeed, writing $\nabla f=a^i\,\partial/\partial x^i$, the defining property gives $g_{ij}a^i=\partial f/\partial x^j$, so $a^i=g^{ij}\,\partial f/\partial x^j$. Now we compute the gradient of $f$ on $\mathbb{R}^{n+1}$ in polar coordinates:
\begin{align*}
\nabla_{\mathbb{R}^{n+1}}f = \frac{\partial f}{\partial r}\frac{\partial}{\partial r}+\frac{1}{r^2}\sum_{i=1}^{n}\sum_{j=1}^{n}g^{ij}_{\mathbb{S}^n}\frac{\partial f}{\partial \theta^j}\frac{\partial}{\partial\theta^i} = \frac{\partial f}{\partial r}\frac{\partial}{\partial r} + \nabla_{\mathbb{S}^n(r)}f.
\end{align*}
Note that my result differs from yours by a factor of $\frac{1}{r^2}$; this is because the surface gradient on $\mathbb{S}^n(r)$ differs from the surface gradient on $\mathbb{S}^n(1)$ by exactly this factor: the induced metric on $\mathbb{S}^n(r)$ is $r^2 g^{\mathbb{S}^n}$, so its inverse carries the $\frac{1}{r^2}$. If we were to write the surface gradient in coordinates and normalize our tangent vectors, we would obtain exactly the representation you have in your question, as the example below shows.
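To make this concrete, take $n=1$, i.e. standard polar coordinates $(r,\theta)$ on $\mathbb{R}^2$, where $g^{\mathbb{S}^1}_{\theta\theta}=1$. The computation above gives
$$\nabla_{\mathbb{R}^2}f=\frac{\partial f}{\partial r}\frac{\partial}{\partial r}+\frac{1}{r^2}\frac{\partial f}{\partial\theta}\frac{\partial}{\partial\theta},$$
and since $\partial/\partial\theta$ has length $r$, rewriting in the unit vectors $\hat{r}=\partial/\partial r$ and $\hat{\theta}=\frac{1}{r}\,\partial/\partial\theta$ recovers the classical formula
$$\nabla_{\mathbb{R}^2}f=\frac{\partial f}{\partial r}\,\hat{r}+\frac{1}{r}\frac{\partial f}{\partial\theta}\,\hat{\theta}.$$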