Where entropy is some measure of the degree of randomness/disorder in a given set of numbers: $S = \{a_1, a_2, \ldots, a_n\}$
For example, the set $S_{high} = \{4,0,2,5,8,3,7,2,5\}$ has a high degree of randomness/disorder.
And the set $S_{low} = \{4,4,4,4,5,5,5,5,5\}$ has a low degree of randomness/disorder.
I am aware of information entropy $IE$, which applies to probability distributions and quantifies the amount of information (related to randomness/disorder) contained in a distribution:
$$IE = \sum_i p_i \log\left(\frac{1}{p_i}\right)$$
However, I simply have numbers. I can take these numbers and convert them to an empirical probability distribution, for example:
$$S_{low} = \{4,4,4,4,5,5,5,5,5\} \;\longrightarrow\; P_{low} = \left\{p_4 = \tfrac{4}{9},\; p_5 = \tfrac{5}{9}\right\}$$
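Applying the $IE$ formula to this distribution gives, in nats:

$$IE(S_{low}) = \frac{4}{9}\log\frac{9}{4} + \frac{5}{9}\log\frac{9}{5} \approx 0.687$$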
The process by which one would do so (counting occurrences to convert raw numbers into an empirical probability distribution, so that the $IE$ formula above can be applied) is not differentiable (at least, it seems that way to me).
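For concreteness, here is a minimal sketch of that hard-counting conversion in PyTorch (the construction is my own, chosen for illustration): the equality comparison that does the counting returns a boolean tensor with no `grad_fn`, so backpropagation fails at exactly that step.

```python
import torch

# The set S_low from above, as a tensor we would like gradients for.
s = torch.tensor([4., 4., 4., 4., 5., 5., 5., 5., 5.], requires_grad=True)

# Empirical distribution by hard counting: p_v = (# elements equal to v) / n.
values = s.detach().unique()                      # distinct values: 4 and 5
p = torch.stack([(s == v).float().mean() for v in values])
print(p)                                          # tensor([0.4444, 0.5556])

ie = (p * torch.log(1.0 / p)).sum()
print(ie)                                         # ~0.687 nats, as above

# (s == v) is piecewise constant in s, so the autograd graph is severed:
ie.backward()  # RuntimeError: element 0 of tensors does not require grad ...
```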
So I wonder: is there any differentiable function that can take a set of raw numbers and approximate the "entropy" of those numbers, in the sense described above?
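One direction I can imagine (a sketch of my own, not an established method) is to replace the hard count with a kernel-smoothed "soft" count, so that every step is differentiable; the `centers` and `bandwidth` below are ad hoc choices of mine. But I don't know whether this is a principled approximation, or whether a standard technique already exists for this.

```python
import torch

def soft_entropy(s, centers, bandwidth=0.1):
    """Differentiable entropy estimate via a kernel-smoothed ("soft") histogram.

    Instead of contributing a hard 0/1 count, each element of `s` spreads a
    unit of mass over `centers` with Gaussian weights, so gradients can flow.
    """
    d = (s.unsqueeze(1) - centers.unsqueeze(0)) / bandwidth  # (n, k) scaled distances
    weights = torch.softmax(-0.5 * d**2, dim=1)              # each row sums to 1
    p = weights.mean(dim=0)                                  # soft empirical distribution
    return -(p * torch.log(p + 1e-12)).sum()                 # entropy in nats

s = torch.tensor([4., 4., 4., 4., 5., 5., 5., 5., 5.], requires_grad=True)
centers = torch.tensor([4., 5.])   # ad hoc choice of histogram centers

ie = soft_entropy(s, centers)
ie.backward()                      # succeeds: the soft counts are differentiable
print(ie.item())                   # ~0.687, matching the hard-count value above
print(s.grad)                      # gradients exist (tiny for small bandwidths)
```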