This is Exercise 10 from section 1.10 of Braun's book on differential equations (3rd edition).
Show that a solution $y(t)$ of the given initial value problem exists on the specified interval: $$y^\prime(t)=y+e^{-y}+e^{-t},\qquad y(0)=0,\qquad 0\leq t\leq 1.$$
The function $f(t,y) = y + e^{-y} + e^{-t}$ and its partial derivative $\frac{\partial f}{\partial y}$ are both continuous on all of $\mathbb{R}^2$. We are supposed to pick a rectangle $R$ of the form $0 \leq t \leq a$, $|y| \leq b$, and set $M = \max\{|f(t,y)| \colon (t,y) \in R\}$; the existence theorem then guarantees a solution on the interval $0 \leq t \leq \alpha$, where $\alpha = \min(a,\, b/M)$.
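Just to spell out that hypothesis (this little computation is mine, not from the book): the partial derivative is $$\frac{\partial f}{\partial y} = 1 - e^{-y},$$ which is obviously continuous everywhere, so the theorem applies on any rectangle of this form.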
The problem is that $M = e^b - b + 1$, attained at the corner $(t,y)=(0,-b)$ of $R$: on $[-b,b]$ the term $y + e^{-y}$ is largest at $y = -b$ (since $e^b - b \geq b + e^{-b}$ for $b \geq 0$), and $e^{-t} \leq 1$ with equality at $t = 0$. So, taking $a \geq 1$, we have $b/M = \frac{b}{e^b-b+1}$, and to get $\alpha \geq 1$ we would need $\frac{b}{e^b - b + 1} \geq 1$, i.e. $2b - 1 \geq e^b$. Well, that never happens: $e^b > 2b - 1$ for all $b$, since $e^b - 2b + 1$ attains its minimum value $3 - 2\ln 2 > 0$ at $b = \ln 2$.
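If it helps to see numbers, here is a quick numerical sanity check (my own sketch, not from Braun; the variable names are mine) that $b/M = b/(e^b - b + 1)$ stays well below $1$, so no choice of $a$ and $b$ in this setup gives $\alpha \geq 1$:

```python
import numpy as np

# Sweep b and evaluate b / M = b / (e^b - b + 1), the quantity that
# would have to reach 1 for the theorem to cover all of 0 <= t <= 1.
b = np.linspace(1e-6, 20.0, 200_001)
ratio = b / (np.exp(b) - b + 1.0)

i = np.argmax(ratio)
print(f"sup of b/M is about {ratio[i]:.4f}, attained near b = {b[i]:.3f}")
# Prints roughly 0.386 near b = 1.28 -- nowhere close to 1.
```

So numerically as well, $\alpha = \min(a, b/M)$ is at most about $0.39$ no matter how the rectangle is chosen.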