This is a partial answer:
The CDF of a uniform random variable on $(0,1)$ is given by
$$
F(x)=\begin{cases} 0 & \text{if } x<0\\ x & \text{if } 0 \le x\le 1\\
1 & \text{otherwise}\end{cases}
$$ So, since $P(\min(U_1,\dots,U_n) > x) = (1-x)^n$ for $n$ independent uniforms $U_1,\dots,U_n$, the CDF of their minimum is given by
$$
G(x)=\begin{cases} 0 & \text{if } x<0\\ 1-(1-x)^n & \text{if } 0 \le x\le 1\\
1 & \text{otherwise}\end{cases}
$$
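As a quick sanity check (not part of the derivation), here is a small Monte Carlo sketch; the values of $n$, $x_0$ and the sample size are arbitrary choices:

```
import numpy as np

# Empirical check of G(x) = 1 - (1 - x)^n for the minimum of n iid U(0,1)
rng = np.random.default_rng(0)
n, x0, trials = 5, 0.2, 200_000        # arbitrary test values
mins = rng.uniform(size=(trials, n)).min(axis=1)

print((mins <= x0).mean())             # empirical CDF at x0
print(1 - (1 - x0) ** n)               # G(x0)
```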
The pdf of the minimum is given by
$$
g(x)= \begin{cases} n(1-x)^{n-1} & \text{if } 0\le x \le 1\\ 0 & \text{otherwise} \end{cases}
$$
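Similarly, one can verify numerically that $g$ integrates to one and yields the mean $E[X]=\frac{1}{n+1}$ of the minimum (again only a sketch, with an arbitrary $n$):

```
from scipy.integrate import quad

n = 5                                        # arbitrary test value
g = lambda x: n * (1 - x) ** (n - 1)

total = quad(g, 0, 1)[0]                     # should be 1
mean = quad(lambda x: x * g(x), 0, 1)[0]     # should be 1/(n+1)
print(total, mean, 1 / (n + 1))
```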
The characteristic function of the minimum is given, for $t \neq 0$, by
$$
\phi(t)= n \int_0^1 e^{itx} (1-x)^{n-1} dx= n! \dfrac{e^{it}}{(it)^n} \left(1- e^{-it} \sum_{k=0}^{n-1} \frac{(it)^k}{k!}\right)
$$
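The closed form can be spot-checked against direct numerical integration of the defining integral; in the sketch below the values of $n$ and $t$ are arbitrary (very small $|t|$ is avoided, since the closed form loses precision there by cancellation):

```
import numpy as np
from math import factorial
from scipy.integrate import quad

n = 4                                   # arbitrary test value

def phi_integral(t):
    # n * int_0^1 e^{itx} (1-x)^(n-1) dx, real and imaginary parts by quadrature
    re = quad(lambda x: n * np.cos(t * x) * (1 - x) ** (n - 1), 0, 1)[0]
    im = quad(lambda x: n * np.sin(t * x) * (1 - x) ** (n - 1), 0, 1)[0]
    return re + 1j * im

def phi_closed(t):
    # n! * e^{it} / (it)^n * (1 - e^{-it} * sum_{k<n} (it)^k / k!)
    partial_sum = sum((1j * t) ** k / factorial(k) for k in range(n))
    return factorial(n) * np.exp(1j * t) / (1j * t) ** n * (1 - np.exp(-1j * t) * partial_sum)

for t in (0.5, 2.0, 7.3):
    print(phi_integral(t), phi_closed(t))   # the two columns should agree
```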
Let us call $S_\ell=\sum_{i=1}^\ell X_i$, where the $X_i$ are i.i.d. random variables with CDF $G(x)$. Since they are i.i.d., the characteristic function of $S_\ell$ is $\phi(t)^\ell$, and the density of $S_\ell$ can be recovered by Fourier inversion:
$$
s_\ell(x)= \mathcal{F}^{-1}\!\left(\phi(t)^\ell\right)(x)=\frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-itx}\,\phi(t)^\ell\,dt
$$
(I haven't checked, but it may be possible to evaluate this integral by contour integration: expanding $\phi(t)^\ell$ produces terms with poles of order up to $\ell n$ at the origin, even though $\phi(t)^\ell$ itself is entire, so the apparent singularity of the full integrand at $t=0$ is removable.)
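Numerically, the inversion can be approximated by truncating the integral in $t$; the sketch below does this at a single point and compares with a Monte Carlo estimate of the density (the values of $n$, $\ell$, the evaluation point, the grids and the truncation are all arbitrary choices):

```
import numpy as np
from scipy.integrate import trapezoid

n, l, x0 = 3, 4, 1.0                       # arbitrary test values
u = np.linspace(0.0, 1.0, 2001)            # grid for the defining integral of phi
t = np.linspace(-150.0, 150.0, 6001)       # truncated frequency grid
kernel = n * (1.0 - u) ** (n - 1)

# phi(t) = n * int_0^1 e^{itu} (1-u)^(n-1) du, evaluated by quadrature
phi = np.array([trapezoid(np.exp(1j * tt * u) * kernel, u) for tt in t])

# s_l(x0) = (1/2pi) * int e^{-i t x0} phi(t)^l dt, truncated to |t| <= 150
s_at_x0 = trapezoid(np.exp(-1j * t * x0) * phi ** l, t).real / (2 * np.pi)

# Monte Carlo density estimate near x0 for comparison
rng = np.random.default_rng(1)
sums = rng.uniform(size=(200_000, l, n)).min(axis=2).sum(axis=1)
print(s_at_x0, (np.abs(sums - x0) < 0.01).mean() / 0.02)
```

(Since $|\phi(t)|$ only decays like $1/|t|$, the truncation converges slowly for small $\ell$; this is one reason the convolution approach below can be more convenient.)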
Another possible approach: by the properties of the Fourier transform (products of characteristic functions correspond to convolutions of densities),
$$
s_\ell(x)= \underbrace{(g * g * \cdots * g)}_{\ell \text{ factors}}(x),
$$
the $\ell$-fold convolution of $g$ with itself.
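On a discrete grid this repeated self-convolution is straightforward to compute; here is a minimal sketch (the grid step and the values of $n$ and $\ell$ are arbitrary) that also checks the normalization of $s_\ell$ and the mean $E[S_\ell]=\ell/(n+1)$:

```
import numpy as np

n, l, dx = 3, 4, 1e-3                      # arbitrary test values
x = np.arange(0.0, 1.0 + dx, dx)
g = n * (1.0 - x) ** (n - 1)

s = g.copy()
for _ in range(l - 1):                     # l-fold convolution of g with itself
    s = np.convolve(s, g) * dx             # dx makes this a Riemann sum of the convolution integral

grid = np.arange(len(s)) * dx              # s_l is supported on [0, l]
print(np.sum(s) * dx)                      # normalization, should be close to 1
print(np.sum(grid * s) * dx, l / (n + 1))  # mean of S_l vs l/(n+1)
```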
As in this article, the probability that the running sum first exceeds one exactly at the $\ell$-th term is given by
$$
P^{(1)}_\ell= \int_1^\ell s_{\ell}(x)\,dx- \int_1^{\ell-1} s_{\ell-1}(x)\,dx = P(S_\ell > 1) - P(S_{\ell-1} > 1),
$$
and the expected number of terms needed will be given by
$$
E^{(1)}= \sum_{\ell=1}^\infty \ell\, P^{(1)}_\ell
$$
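Putting the pieces together numerically: the sketch below recomputes $s_\ell$ on a grid, evaluates $P^{(1)}_\ell$ as the difference of tail integrals, truncates the series for $E^{(1)}$, and compares with a direct Monte Carlo simulation of the first index at which the running sum exceeds one (the grid step, the cut-offs and the sample size are all arbitrary choices):

```
import numpy as np

n, dx, lmax = 3, 1e-3, 12                  # arbitrary test values / truncation of the series
x = np.arange(0.0, 1.0 + dx, dx)
g = n * (1.0 - x) ** (n - 1)

def tail_above_one(density):               # approx P(S_l > 1), i.e. the integral of s_l over (1, l)
    grid = np.arange(len(density)) * dx
    return np.sum(density[grid > 1.0]) * dx

tails, s = [tail_above_one(g)], g.copy()
for _ in range(lmax - 1):
    s = np.convolve(s, g) * dx
    tails.append(tail_above_one(s))
p = [tails[0]] + [tails[i] - tails[i - 1] for i in range(1, lmax)]   # P^{(1)}_l for l = 1..lmax
E_truncated = sum(l * pl for l, pl in enumerate(p, start=1))

# Monte Carlo: index at which the running sum of the minima first exceeds 1
rng = np.random.default_rng(2)
partial = rng.uniform(size=(100_000, 20, n)).min(axis=2).cumsum(axis=1)  # 20 terms assumed enough
first = (partial > 1.0).argmax(axis=1) + 1
print(E_truncated, first.mean())
for l in range(1, 7):
    print(l, p[l - 1], (first == l).mean())
```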