Questions tagged [entropy]

This tag is for questions about mathematical entropy. If you have a question about thermodynamic entropy, visit Physics Stack Exchange or Chemistry Stack Exchange instead.

1694 questions
174
votes
17 answers

Intuitive explanation of entropy

I have bumped into entropy many times, but it has never been clear to me why we use this formula: If $X$ is a random variable, then its entropy is $$H(X) = -\displaystyle\sum_{x} p(x)\log p(x).$$ Why are we using this formula? Where did this formula…
jjepsuomi
  • 8,979
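A minimal numerical sketch of this definition (the two distributions below are illustrative, not taken from the question):

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H(X) = -sum_x p(x) log p(x); zero-probability terms are skipped."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Illustrative distributions (hypothetical, not from the question):
print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin is maximally uncertain
print(entropy([0.9, 0.1]))   # ~0.47 bits: a biased coin is more predictable
```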
35
votes
4 answers

Shannon entropy of a fair dice

The formula for Shannon entropy is as follows: $$\text{Entropy}(S) = - \sum_i p_i \log_2 p_i $$ Thus, a fair six-sided die should have entropy $$- \sum_{i=1}^6 \dfrac{1}{6} \log_2 \dfrac{1}{6} = \log_2 (6) = 2.5849...$$ However, the entropy…
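The arithmetic in the excerpt can be reproduced directly; a quick sketch using base-2 logarithms, so the result is in bits:

```python
import math

p = [1 / 6] * 6                              # fair six-sided die
H = -sum(pi * math.log2(pi) for pi in p)     # -sum (1/6) * log2(1/6)
print(H, math.log2(6))                       # both print ~2.5849
```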
28
votes
2 answers

What are differences and relationship between shannon entropy and fisher information?

When I first got into information theory, information was measured by or based on Shannon entropy; in other words, most books I had read talked about Shannon entropy. Today someone told me there is another measure of information called Fisher…
27
votes
3 answers

An information theory inequality which relates to Shannon Entropy

For $a_1,...,a_n,b_1,...,b_n>0,\quad$ define $a:=\sum a_i,\ b:=\sum b_i,\ s:=\sum \sqrt{a_ib_i}$. Is the following inequality true?: $${\frac{\Bigl(\prod a_i^{a_i}\Bigr)^\frac1a}a \cdot \frac{\left(\prod b_i^{b_i}\right)^\frac1b}b…
26
votes
4 answers

Is Standard Deviation the same as Entropy?

We know that the standard deviation (SD) represents the level of dispersion of a distribution. Thus a distribution with only one value (e.g., 1, 1, 1, 1) has an SD of zero. Similarly, such a distribution requires little information to be defined. On…
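One way to see that the two measures capture different things: entropy depends only on the probabilities, while the standard deviation also depends on the outcome values. A small sketch with made-up two-point distributions:

```python
import math

def entropy_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def std_dev(values, probs):
    mean = sum(v * p for v, p in zip(values, probs))
    return math.sqrt(sum(p * (v - mean) ** 2 for v, p in zip(values, probs)))

# Same probabilities (hence the same entropy), very different spreads:
probs = [0.5, 0.5]
print(entropy_bits(probs), std_dev([0, 1], probs))    # 1.0 bit, SD = 0.5
print(entropy_bits(probs), std_dev([0, 100], probs))  # 1.0 bit, SD = 50.0
```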
26
votes
2 answers

Can the entropy of a random variable with countably many outcomes be infinite?

Consider a random variable $X$ taking values over $\mathbb{N}$. Let $\mathbb{P}(X = i) = p_i$ for $i \in \mathbb{N}$. The entropy of $X$ is defined by $$H(X) = \sum_i -p_i \log p_i.$$ Is it possible for $H(X)$ to be infinite?
VSJ
  • 1,091
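A standard candidate (not taken from the question itself) is $p_i \propto 1/(i \log^2 i)$ for $i \ge 2$: the probabilities have a finite sum, while the entropy series diverges. A rough numerical sketch, truncating the distribution at larger and larger $N$:

```python
import math

def truncated_entropy(N):
    """Entropy (in nats) of p_i proportional to 1/(i * log(i)^2), truncated at i = N."""
    weights = [1.0 / (i * math.log(i) ** 2) for i in range(2, N)]
    Z = sum(weights)                                   # stays finite as N grows
    return sum(-(w / Z) * math.log(w / Z) for w in weights)

# The truncated entropy keeps growing (slowly) as N increases, consistent
# with the full countable distribution having infinite entropy.
for N in (10**3, 10**4, 10**5, 10**6):
    print(N, round(truncated_entropy(N), 3))
```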
26
votes
1 answer

Entropy of a binomial distribution

How do we get the functional form for the entropy of a binomial distribution? Do we use Stirling's approximation? According to Wikipedia, the entropy is: $$\frac1 2 \log_2 \big( 2\pi e\, np(1-p) \big) + O \left( \frac{1}{n} \right)$$ As of now,…
user844541
  • 1,583
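The quoted expression can be compared against a brute-force computation; a sketch follows (the values of $n$ and $p$ are arbitrary choices for illustration):

```python
import math

def binomial_entropy_bits(n, p):
    """Exact entropy of Binomial(n, p) in bits, by summing over all n + 1 outcomes."""
    H = 0.0
    for k in range(n + 1):
        pk = math.comb(n, k) * p**k * (1 - p)**(n - k)
        if pk > 0:
            H -= pk * math.log2(pk)
    return H

n, p = 200, 0.3
approx = 0.5 * math.log2(2 * math.pi * math.e * n * p * (1 - p))
print(binomial_entropy_bits(n, p), approx)   # the two values agree to O(1/n)
```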
25
votes
1 answer

Entropy of a uniform distribution

The entropy of a uniform distribution is $\ln(b-a)$. With $a=0$ and $b=1$ this reduces to zero. How come there is no uncertainty?
log2
  • 263
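The $\ln(b-a)$ here is a differential entropy, which, unlike discrete Shannon entropy, can be zero or negative. A small sketch of the formula for a few intervals:

```python
import math

def uniform_differential_entropy(a, b):
    """Differential entropy (in nats) of Uniform(a, b): ln(b - a)."""
    return math.log(b - a)

# Differential entropy can be zero or even negative, unlike discrete entropy.
print(uniform_differential_entropy(0, 1))    #  0.0
print(uniform_differential_entropy(0, 2))    #  ~0.693
print(uniform_differential_entropy(0, 0.5))  # ~-0.693
```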
23
votes
5 answers

The entropy of entropy (or how to fix an unfair die)

$\newcommand{\on}[1]{\operatorname{#1}}$ I have recently noticed this behavior: Let $\on{P}$ be a discrete probability distribution $$ \on{P} = \left\{p_{1},\ldots, p_{n} \right\}\ \mbox{where}\ p_{1} + \cdots + p_{n} = 1,\quad p_{i} > 0\ \forall\…
23
votes
2 answers

How Entropy scales with sample size

For a discrete probability distribution, the entropy is defined as: $$H(p) = -\sum_i p(x_i) \log(p(x_i))$$ I'm trying to use the entropy as a measure of how "flat / noisy" vs. "peaked" a distribution is, where smaller entropy corresponds to more…
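For the flat-vs-peaked comparison, one common device (an illustrative choice, not necessarily the asker's) is to divide by the maximum possible value $\log n$, so the measure lies in $[0, 1]$ regardless of the number of outcomes:

```python
import math

def entropy(probs):
    return -sum(p * math.log(p) for p in probs if p > 0)

def normalized_entropy(probs):
    """Entropy divided by its maximum log(n), giving a value in [0, 1]."""
    n = len(probs)
    return entropy(probs) / math.log(n) if n > 1 else 0.0

flat   = [0.25, 0.25, 0.25, 0.25]
peaked = [0.97, 0.01, 0.01, 0.01]
print(normalized_entropy(flat))    # 1.0   (maximally flat)
print(normalized_entropy(peaked))  # ~0.12 (strongly peaked)
```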
21
votes
4 answers

Estimating the entropy

Given a discrete random variable $X$, I would like to estimate the entropy of $Y=f(X)$ by sampling. I can sample uniformly from $X$. The samples are just random vectors of length $n$ where the entries are $0$ or $1$. For each sample vector $x_i$, I…
user66307
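One simple (though biased) approach is the plug-in estimate: draw samples, count the frequencies of $f(x)$, and evaluate the entropy formula on the empirical distribution. A sketch below; the vector length and the choice of $f$ (counting ones) are made up for illustration, since the question's $f$ is not specified:

```python
import math
import random
from collections import Counter

def plug_in_entropy(samples):
    """Plug-in (empirical) entropy estimate in bits; biased low for small sample sizes."""
    counts = Counter(samples)
    total = len(samples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical setup: X is a uniform 0/1 vector of length n, and f counts the ones.
n, num_samples = 8, 100_000
f = lambda x: sum(x)
ys = [f([random.randint(0, 1) for _ in range(n)]) for _ in range(num_samples)]
print(plug_in_entropy(ys))   # estimate of H(f(X)); ~2.54 bits for this particular f
```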
21
votes
1 answer

Best way to play 20 questions

Background You and I are going to play a game. To start off with I play a measurable function $f_1$ and you respond with a real number $y_1$ (possibly infinite). We repeat this some fixed number $N$ of times, to obtain a collection…
19
votes
9 answers

(Elegant) proof of : $x \log_2\frac{1}{x}+(1-x) \log_2\frac{1}{1-x} \geq 1- (1-\frac{x}{1-x})^2$

I am looking for the most concise and elegant proof of the following inequality: $$ h(x) \geq 1- \left(1-\frac{x}{1-x}\right)^2, \qquad \forall x\in(0,1) $$ where $h(x) = x \log_2\frac{1}{x}+(1-x) \log_2\frac{1}{1-x}$ is the binary entropy function.…
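Not a proof, but a quick numerical sanity check of the claimed inequality over a grid of points in $(0,1)$:

```python
import math

def h(x):
    """Binary entropy function in bits."""
    return x * math.log2(1 / x) + (1 - x) * math.log2(1 / (1 - x))

def rhs(x):
    return 1 - (1 - x / (1 - x)) ** 2

# Scan a fine grid of (0, 1) for counterexamples to h(x) >= 1 - (1 - x/(1-x))^2.
violations = [x for x in (i / 10_000 for i in range(1, 10_000)) if h(x) < rhs(x)]
print(violations)   # expected: [] (no violations on this grid)
```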
18
votes
2 answers

Inequality involving entropies: $\left \|p -\frac{1}{n} e \right \|_2\ge\left \|\frac{-p\log p}{H(p)} -\frac{1}{n} e \right \|_2$

For a given probability vector $p=(p_1,\dots,p_n)$ with $p_1,\dots,p_n > 0, \sum_{i=1}^n p_i=1$ and with $e:= (1, \dots, 1)$, I want to prove the following inequality: $$\small\left \|p -\frac{1}{n} e \right \|^2_2=\sum_{i=1}^n \left (p_i…
Amir
  • 11,124
18
votes
6 answers

Why can't I compress an encrypted file?

Let's say I have a txt file, called harry_potter.txt. I can easily compress it with any compression algorithm. So the entropy of the file is "smaller" than its size on the disk. But if I encrypt the file with AES-256-CBC or AES-256-ECB (I used…
Yuxuan Lu
  • 293
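The effect is easy to reproduce without any cryptography: bytes that already look uniformly random (as good ciphertext does) barely compress, while natural-language text compresses well. A sketch using os.urandom as a stand-in for ciphertext:

```python
import os
import zlib

text = b"It was the best of times, it was the worst of times. " * 2000
noise = os.urandom(len(text))   # stand-in for ciphertext: near-uniform bytes

print(len(text),  len(zlib.compress(text, 9)))    # plain text shrinks dramatically
print(len(noise), len(zlib.compress(noise, 9)))   # "ciphertext" barely changes size
```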