I came across this question myself as well, and while the original answer gives an equation for the derivative, it does not give conditions under which that derivative exists. To address this, we use the implicit function theorem.
For all $t \in \mathbb{R}$, define $f_t: \mathbb{R} \times \mathbb{R} \to \mathbb{R}$ by
$$f_t(\varepsilon, s) = \varphi_\varepsilon(s) - t$$
Note that $\varphi_\varepsilon^{-1}(t)$ satisfies
$$f_t(\varepsilon, \varphi_\varepsilon^{-1}(t)) = 0$$
Given some $\varepsilon_0 \in \mathbb{R}$, suppose $f_t$ is continuously differentiable in a neighborhood of $(\varepsilon_0, \varphi_{\varepsilon_0}^{-1}(t))$, i.e. that $(\varepsilon, s) \mapsto \varphi_\varepsilon(s)$ is $C^1$ there (in particular, both partial derivatives $\partial_\varepsilon \varphi_\varepsilon(s)$ and $\partial_s \varphi_\varepsilon(s)$ must exist and be continuous), and suppose that $\partial_s f_t(\varepsilon_0, \varphi_{\varepsilon_0}^{-1}(t)) = \partial_s \varphi_{\varepsilon_0}(\varphi_{\varepsilon_0}^{-1}(t))$ is non-zero. Then by the implicit function theorem, there exist an open set $U \subset \mathbb{R}$ containing $\varepsilon_0$ and a unique continuously differentiable function $g_t: U \to \mathbb{R}$ such that
$$g_t(\varepsilon_0) = \varphi_{\varepsilon_0}^{-1}(t)$$
$$f_t(\varepsilon, g_t(\varepsilon)) = 0$$
for all $\varepsilon \in U$, with derivative given by
$$g'_t(\varepsilon) = -\frac{\partial_\varepsilon f_t(\varepsilon, g_t(\varepsilon))}{\partial_s f_t(\varepsilon, g_t(\varepsilon))} = -\frac{\partial_\varepsilon \varphi_\varepsilon(g_t(\varepsilon))}{\partial_s \varphi_\varepsilon(g_t(\varepsilon))}$$
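As a quick sanity check of this formula, take for instance $\varphi_\varepsilon(s) = e^\varepsilon s$ (an example chosen here purely for illustration). Then $\varphi_\varepsilon^{-1}(t) = t e^{-\varepsilon}$ in closed form, so $g_t(\varepsilon) = t e^{-\varepsilon}$ on all of $\mathbb{R}$, and differentiating directly gives $g'_t(\varepsilon) = -t e^{-\varepsilon}$. The formula above gives the same result:
$$-\frac{\partial_\varepsilon \varphi_\varepsilon(g_t(\varepsilon))}{\partial_s \varphi_\varepsilon(g_t(\varepsilon))} = -\frac{e^\varepsilon \cdot t e^{-\varepsilon}}{e^\varepsilon} = -t e^{-\varepsilon}.$$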
Returning to the general case, plugging in $\varepsilon = \varepsilon_0$ we have
$$g'_t(\varepsilon_0) = -\frac{\partial_\varepsilon \varphi_\varepsilon(\varphi_{\varepsilon_0}^{-1}(t))\big|_{\varepsilon = \varepsilon_0}}{\partial_s \varphi_{\varepsilon_0}(\varphi_{\varepsilon_0}^{-1}(t))}$$
which is the same formula as in the original answer, but now with precise conditions guaranteeing that it holds.
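If one wants to verify the result numerically in a case where no closed-form inverse is available, here is a small finite-difference sketch; the choice $\varphi_\varepsilon(s) = s + \varepsilon \sin(s)$ and all names in it are purely illustrative:

```python
import numpy as np
from scipy.optimize import brentq

# Numerical sanity check of g'_t(eps) = -d_eps(phi)/d_s(phi), using an
# illustrative choice phi_eps(s) = s + eps*sin(s), which is strictly
# increasing in s (hence invertible) whenever |eps| < 1.

def phi(eps, s):
    return s + eps * np.sin(s)

def phi_inv(eps, t):
    # invert s -> phi_eps(s) numerically; [t - 1, t + 1] brackets the root for |eps| < 1
    return brentq(lambda s: phi(eps, s) - t, t - 1.0, t + 1.0)

eps0, t, h = 0.3, 1.7, 1e-6

# central finite difference of eps -> phi_eps^{-1}(t) at eps0
fd = (phi_inv(eps0 + h, t) - phi_inv(eps0 - h, t)) / (2 * h)

# implicit-function-theorem formula, evaluated at s0 = phi_{eps0}^{-1}(t):
# d_eps phi = sin(s), d_s phi = 1 + eps*cos(s)
s0 = phi_inv(eps0, t)
ift = -np.sin(s0) / (1 + eps0 * np.cos(s0))

print(fd, ift)  # the two values agree up to finite-difference / root-finding error
```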