They gave you the hand-wavy argument above; here is the rigorous proof.
Recall from basic differential calculus:
Basic differentiation algebra: the derivative acts linearly, $(f + \alpha g)'(x) = f'(x) + \alpha g'(x)$; the derivative of a constant function is zero; and a continuous linear function is its own derivative, $f'(x) \cdot h = f(h)$ whenever $f$ is linear and continuous.
Chain rule: if $f$ and $g$ are functions defined on open subsets of normed vector spaces such that $f$ is differentiable at $x$ and $g$ is differentiable at $f(x)$, then the composite $g \circ f$ is differentiable at $x$ and its derivative is the composite of the derivatives: $$(g \circ f)'(x) = g'(f(x)) \circ f'(x).$$
Abridged proof. Write $y = f(x)$ and $f(x + h) = f(x) + \underbrace{f'(x) \cdot h + o(h)}_{k}$. Since $k = O(h)$, we have $o(k) = o(h)$, and therefore $$g(y + k) = g(y) + g'(y) \cdot k + o(k) = g(y) + g'(y) \cdot f'(x) \cdot h + \underbrace{g'(y) \cdot o(h) + o(k)}_{o(h)}. \square$$
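Before applying this to your exercise, here is a throwaway numerical sanity check of the chain rule (Python/NumPy). The choices $f(X) = XX$ and $g(Y) = \operatorname{tr}(Y^2)$ are mine, purely for illustration, and have nothing to do with your $F$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
X = rng.standard_normal((n, n))
H = rng.standard_normal((n, n))

def f(X):            # f(X) = X X, with derivative f'(X).H = X H + H X
    return X @ X

def g(Y):            # g(Y) = tr(Y Y), with derivative g'(Y).K = 2 tr(Y K)
    return np.trace(Y @ Y)

# Chain-rule prediction: (g o f)'(X).H = g'(f(X)) . (f'(X).H)
predicted = 2.0 * np.trace(f(X) @ (X @ H + H @ X))

# Central difference quotient of g o f at X in the direction H
t = 1e-6
approx = (g(f(X + t * H)) - g(f(X - t * H))) / (2 * t)

print(predicted, approx)   # the two values agree to many digits
```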
Now to your exercise. The function $H$ is differentiable at every point $U$ at which $F$ is differentiable.
Proof. The functions $\varphi : V \mapsto GW V^\intercal$ and $\psi : U \mapsto SU^\intercal$ are linear, while the function $U \mapsto C$ is constant. Therefore the function $H = \varphi \circ F + \psi + C$ is differentiable at every point where $F$ is differentiable (by the chain rule), and its derivative is simply $$H'(U) = \varphi'(F(U)) \circ F'(U) + \psi'(U) = \varphi \circ F'(U) + \psi.$$
If you are dealing with finite-dimensional vector spaces, pick bases of each so that (denoting by $[\,\cdot\,]$ the matrix representation) we get $$[H'(U)] = [\varphi]\,[F'(U)] + [\psi]. \square$$
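If you want to see the formula in action, here is a minimal numerical sketch. Your post does not pin down $F$ or the dimensions, so I take every matrix to be $n \times n$ and use the hypothetical choice $F(U) = U^\intercal U$, whose derivative is $F'(U) \cdot \Delta = U^\intercal \Delta + \Delta^\intercal U$; the difference quotient of $H$ is then compared against the predicted $\varphi(F'(U) \cdot \Delta) + \psi(\Delta) = GW\,(F'(U) \cdot \Delta)^\intercal + S\Delta^\intercal$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
G, W, S, C = (rng.standard_normal((n, n)) for _ in range(4))
U = rng.standard_normal((n, n))
D = rng.standard_normal((n, n))        # a direction Delta

def F(U):                              # hypothetical stand-in for your F: F(U) = U^T U
    return U.T @ U

def dF(U, D):                          # its derivative: F'(U).D = U^T D + D^T U
    return U.T @ D + D.T @ U

def H(U):                              # H(U) = G W F(U)^T + S U^T + C, i.e. phi(F(U)) + psi(U) + C
    return G @ W @ F(U).T + S @ U.T + C

# Prediction from the proof: H'(U).D = phi(F'(U).D) + psi(D)
predicted = G @ W @ dF(U, D).T + S @ D.T

# Central difference quotient of H at U in the direction D
t = 1e-6
approx = (H(U + t * D) - H(U - t * D)) / (2 * t)

print(np.max(np.abs(predicted - approx)))   # small (~1e-9): the formula checks out
```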
Addendum. If the function $\varphi$ is invertible, then the differentiability of $H$ implies that of $F$, for we can write $F = \varphi^{-1} \circ (H - \psi - C)$, and $\varphi^{-1}$, being linear and continuous, is differentiable. $\square$
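A quick numerical illustration of the addendum, under the same hypothetical $F$ and square $n \times n$ dimensions: in this setting $\varphi(V) = GWV^\intercal$ is invertible exactly when $GW$ is, with $\varphi^{-1}(Y) = ((GW)^{-1} Y)^\intercal$, and $F$ is recovered from $H$ exactly as claimed:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
G, W, S, C = (rng.standard_normal((n, n)) for _ in range(4))
U = rng.standard_normal((n, n))

def F(U):                              # same hypothetical F as above
    return U.T @ U

def H(U):                              # H(U) = G W F(U)^T + S U^T + C
    return G @ W @ F(U).T + S @ U.T + C

# phi(V) = G W V^T is invertible iff G W is, and then phi^{-1}(Y) = ((G W)^{-1} Y)^T.
GW_inv = np.linalg.inv(G @ W)          # random square G, W are invertible almost surely
F_recovered = (GW_inv @ (H(U) - S @ U.T - C)).T

print(np.allclose(F_recovered, F(U)))  # True: F = phi^{-1} o (H - psi - C)
```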