I have been trying to solve this problem for a while now, with no luck.
Let $F$ be the set of all continuous functions $f:\mathbb{R} \rightarrow \mathbb{R}$ such that $$e^{f(x)}+f(x)\geq x+1, \forall x\in \mathbb{R}$$
If $$I(f)=\int_{0}^{e}f(x)dx$$ what is the minimum value of $I(f),f\in F?$
Isolating $f(x)$ in the given inequality and integrating, $$\begin{align} \int_{0}^{e}f(x)dx &\geq\int_0^e(x+1-e^{f(x)})dx \\ &=\frac{e^2}{2}+e-\int_0^ee^{f(x)}dx \end{align}$$ and the minimum (our answer) should occur exactly when equality holds. There are two issues with this, though: both sides depend on $f(x)$, so I can't simply maximize the integral on the RHS. And even if I could, $\int_0^e e^{f(x)}dx$ could be arbitrarily large, driving the RHS to $-\infty?!$ Worse, by doing this I have effectively discarded the original pointwise constraint.
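As a sanity check that $F$ is nonempty at all (my own toy example, not part of the problem), take $f(x)=\max(x,0)$: for $x<0$ the constraint reads $1\geq x+1$, and for $x\geq 0$ it reads $e^x+x\geq x+1$, i.e. $e^x\geq 1$, so it holds everywhere. A rough numerical sketch of its integral over $[0,e]$:

```python
import math

# Toy candidate: f(x) = max(x, 0), continuous on all of R.
def f(x):
    return max(x, 0.0)

# Spot-check the constraint e^{f(x)} + f(x) >= x + 1 on a grid.
for i in range(-500, 501):
    x = i / 50.0  # sample points in [-10, 10]
    assert math.exp(f(x)) + f(x) >= x + 1 - 1e-12

# Composite trapezoid rule for I(f) = integral of f over [0, e].
def trapezoid(g, a, b, n=10_000):
    h = (b - a) / n
    s = 0.5 * (g(a) + g(b)) + sum(g(a + k * h) for k in range(1, n))
    return s * h

I = trapezoid(f, 0.0, math.e)
print(I)  # ≈ e^2 / 2 ≈ 3.6945
```

So this particular $f$ gives $I(f)=e^2/2$, which is presumably far from optimal, but at least the feasible set is nonempty.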
Also, since $y\mapsto e^y+y$ is strictly increasing and maps $\mathbb{R}$ onto $\mathbb{R}$, it is a bijection, so perhaps there is a (unique continuous) $f$ with $e^{f(x)}+f(x)= x+1\ldots$ but I am not sure that is useful either. We also can't say anything about the monotonicity of a general $f\in F$. Can someone please help?
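To follow up on the bijection idea numerically (again just my own exploration): writing $g(y)=e^y+y$, the equality candidate is $f(x)=g^{-1}(x+1)$, which I can evaluate by bisection since $g$ is strictly increasing. Note $g(0)=1$ and $g(1)=e+1$, so this $f$ should satisfy $f(0)=0$ and $f(e)=1$. A trapezoid-rule estimate of its integral over $[0,e]$:

```python
import math

def g(y):
    return math.exp(y) + y

# Invert g by bisection; g is strictly increasing, so the root is unique.
def g_inv(t, lo=-50.0, hi=50.0, tol=1e-12):
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) < t:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# The equality candidate f(x) = g^{-1}(x + 1).
def f_eq(x):
    return g_inv(x + 1)

# Endpoint checks: g(0) = 1 so f_eq(0) = 0, and g(1) = e + 1 so f_eq(e) = 1.
assert abs(f_eq(0.0)) < 1e-9
assert abs(f_eq(math.e) - 1.0) < 1e-9

# Composite trapezoid rule for I(f_eq) over [0, e].
def trapezoid(h, a, b, n=2_000):
    step = (b - a) / n
    s = 0.5 * (h(a) + h(b)) + sum(h(a + k * step) for k in range(1, n))
    return s * step

I = trapezoid(f_eq, 0.0, math.e)
print(I)  # numerically very close to 1.5
```

Interestingly, the estimate comes out very close to $3/2$, which makes me suspect the equality case really is the minimizer, but I don't see how to prove it.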