The constant in the Berry-Esseen theorem:
If we have a bunch of i.i.d. random variables $(X_j)_{j\geq 1}$ with a finite third moment, that is $E[|X_j|^3]<\infty$ (and thus they also have some mean $\mu$ and variance $\sigma^2$), then we can prove without too much trouble that their standardized sum, $A_n := \frac{(\sum_{j=1}^n X_j)-n\mu}{\sigma \sqrt{n}}$, converges in distribution to the standard normal distribution $\mathcal{N}(0,1)$. This is just a weak version of the Central Limit Theorem, since it assumes a finite third moment rather than only a finite variance. (The first few sections of the Wikipedia article on the central limit theorem list CLTs that assume less.)
Denote $A_n$'s cumulative distribution function as $G_n(x)$, and call the standard normal's CDF $\Phi(x).$
We can prove that for any sample size $n$, $$\sup_x |G_n(x)-\Phi(x)|\leq c \cdot \frac{E\!\left[|X_1-\mu|^3\right]}{\sigma^3\sqrt{n}},$$
for some universal constant $c>0$. (Here $E[|X_1-\mu|^3]$ is the centered third absolute moment, which is finite whenever $E[|X_1|^3]$ is.)
As of 2012 we know $c < 0.4748$ (at the time the book was published the best known bound was $c<0.7975$), but the exact optimal value of $c$ is still unknown.
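To see what the bound buys you concretely (an illustrative plug-in using the 2012 value of $c$): if the $X_j$ are fair coin flips taking the values $\pm 1$, then $\mu = 0$, $\sigma = 1$, and $E[|X_1-\mu|^3] = 1$, so for $n = 10{,}000$ the theorem guarantees $$\sup_x |G_n(x)-\Phi(x)| \leq \frac{0.4748}{\sqrt{10{,}000}} \approx 0.0047.$$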
The CLT tells us that $A_n$ will get close to the normal distribution eventually, as $n$ gets large. The Berry-Esseen theorem tells us how close it is guaranteed to get for any $n$ you specify.
Or at least, it's supposed to: since we only have bounds on $c$, we don't know precisely how close. This is interesting because a strange constant no one has seen before pops out of seemingly nowhere, given only routine conditions.
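If you want to poke at this numerically, here is a small Python sketch (the $\pm 1$ summands, the helper name, and the use of numpy/scipy are my choices for illustration, not anything from the statement above). For coin-flip summands, $A_n$ is just a recentered, rescaled binomial, so $\sup_x|G_n(x)-\Phi(x)|$ can be computed exactly by checking the jump points of $G_n$ and compared against $0.4748/\sqrt{n}$:

```python
# Sanity check of the Berry-Esseen bound for X_j = +/-1 coin flips
# (mu = 0, sigma = 1, E|X_1 - mu|^3 = 1, so the bound is just c / sqrt(n)).
import numpy as np
from scipy.stats import binom, norm

C = 0.4748  # best known upper bound on the universal constant (as of 2012)

def sup_distance(n):
    """Exact sup_x |G_n(x) - Phi(x)| when the summands are +/-1 coin flips."""
    k = np.arange(n + 1)
    atoms = (2 * k - n) / np.sqrt(n)                # the values A_n can take
    g_right = binom.cdf(k, n, 0.5)                  # G_n at each atom
    g_left = np.concatenate(([0.0], g_right[:-1]))  # left limit of G_n at each atom
    phi = norm.cdf(atoms)
    # G_n is flat between atoms and Phi is monotone, so the sup is attained
    # (or approached) at the atoms, from one side or the other.
    return max(np.abs(g_right - phi).max(), np.abs(g_left - phi).max())

for n in [10, 100, 1000, 10000]:
    print(f"n = {n:5d}:  sup|G_n - Phi| = {sup_distance(n):.5f}"
          f"   Berry-Esseen bound = {C / np.sqrt(n):.5f}")
```

For these very symmetric summands the exact distance comes out around $0.4/\sqrt{n}$, comfortably inside the guarantee, as it must be.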