Assume we have a function
$$H(x) = -\sum_{i=1}^n x_i \log x_i$$ where $0 < x_i < 1$ for all $i$ (this is the Shannon entropy, if you are familiar with it).
I am reading a paper in which the authors state the following approximation: $$H(x) = -\sum_{i=1}^n x_i \log x_i \approx \sum_{i=1}^n x_i (1-x_i).$$ Can anyone explain the intuition behind this approximation?
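For concreteness, here is a quick term-by-term numerical comparison of the two expressions (a minimal sketch in Python; NumPy and the particular grid of values are just my choices for illustration, not taken from the paper):

```python
import numpy as np

# Compare the exact entropy term -x*log(x) with the proposed
# term-by-term approximation x*(1 - x) on a grid of values in (0, 1).
x = np.linspace(0.05, 0.95, 10)

exact = -x * np.log(x)     # term appearing in H(x)
approx = x * (1.0 - x)     # term appearing in the approximation

for xi, e, a in zip(x, exact, approx):
    print(f"x_i = {xi:.2f}   -x_i*log(x_i) = {e:.4f}   x_i*(1 - x_i) = {a:.4f}")
```

Numerically the two terms are close when $x_i$ is near $1$ and less so when $x_i$ is near $0$, but I would like to understand where the approximation comes from.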