For questions about proving and using Jensen's inequality for convex functions. Use this tag together with the [inequality] tag.
Jensen's inequality states that for a convex function $f$ and $\lambda\in[0,1]$, we have $$f\left[\lambda x+(1-\lambda)y\right]\leq\lambda f(x)+(1-\lambda)f(y).$$ In the context of measure-theoretic probability theory, Jensen's inequality states that if $(\Omega,\mathcal F,\mathbb P)$ is a probability space, $f$ is a $\mathbb P$-integrable function, and $\psi$ is a convex function, then $$\psi\left(\int_\Omega f\,\mathrm d\mathbb P\right)\leq\int_\Omega\psi(f)\,\mathrm d\mathbb P.$$
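As a quick sanity check of the two-point form, take the convex function $f(x)=x^2$ with $\lambda=\tfrac12$, $x=0$, $y=2$: $$f\left(\tfrac12\cdot 0+\tfrac12\cdot 2\right)=f(1)=1\leq\tfrac12 f(0)+\tfrac12 f(2)=\tfrac12\cdot 0+\tfrac12\cdot 4=2.$$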
Jensen's inequality is sometimes written in terms of the expectation operator: if $X$ is a random variable and $\psi$ is a convex function as above, then $$\psi(\mathbb E[X])\leq\mathbb E[\psi(X)].$$ It is a broad generalization of the fact that variance is non-negative (i.e. that $\mathbb E(X^2)\geq(\mathbb E X)^2$), with many consequences. For example, it gives one way to prove the AM-GM inequality.
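Indeed, the variance fact is Jensen's inequality applied to $\psi(x)=x^2$, and one standard sketch of AM-GM runs as follows: $-\ln$ is convex on $(0,\infty)$, so for $x_1,\dots,x_n>0$ (with equal weights $\tfrac1n$), $$-\ln\left(\frac1n\sum_{i=1}^n x_i\right)\leq\frac1n\sum_{i=1}^n\big(-\ln x_i\big)=-\ln\left(\prod_{i=1}^n x_i\right)^{1/n},$$ and exponentiating (after flipping signs) gives $\frac1n\sum_{i=1}^n x_i\geq\left(\prod_{i=1}^n x_i\right)^{1/n}$.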
It also has uses in combinatorics (via, e.g., the discrete version: if $\alpha_i\geq 0$ and $\sum_i\alpha_i=1$, then $\psi\left(\sum_i\alpha_i x_i\right)\leq\sum_i\alpha_i\psi(x_i)$), real analysis, harmonic analysis, and geometry.
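As a concrete instance of the discrete version, taking $\psi(x)=x^2$ and equal weights $\alpha_i=\frac1n$ gives $$\left(\frac1n\sum_{i=1}^n x_i\right)^2\leq\frac1n\sum_{i=1}^n x_i^2,$$ i.e. the arithmetic mean is at most the quadratic mean.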
External links: [Wikipedia page on Jensen's inequality](https://en.wikipedia.org/wiki/Jensen%27s_inequality)