Questions tagged [entropy]

121 questions
39 votes · 7 answers

Can PRNGs be used to magically compress stuff?

This idea occurred to me as a kid learning to program and on first encountering PRNGs. I still don't know how realistic it is, but now there's Stack Exchange. Here's a 14-year-old's scheme for an amazing compression algorithm: Take a PRNG and seed…
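
The scheme is easy to state in code, and stating it exposes the catch. A minimal sketch, assuming a brute-force seed search with an arbitrary 2^20-seed budget (both choices are illustrative, not from the question):

```python
import random

def find_seed(data: bytes, max_seed: int = 2**20) -> int | None:
    """Search for a PRNG seed whose output stream reproduces `data`."""
    for seed in range(max_seed):
        rng = random.Random(seed)
        if bytes(rng.getrandbits(8) for _ in range(len(data))) == data:
            return seed  # the seed plus the length *is* the "compressed" file
    return None  # no seed in the searched range reproduces the data

def decompress(seed: int, length: int) -> bytes:
    rng = random.Random(seed)
    return bytes(rng.getrandbits(8) for _ in range(length))

# The catch: a b-bit seed can reproduce at most 2**b distinct outputs,
# so almost every input longer than b bits has no short seed at all.
```
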
36 votes · 6 answers

Do lossless compression algorithms reduce entropy?

According to Wikipedia: Shannon's entropy measures the information contained in a message as opposed to the portion of the message that is determined (or predictable). Examples of the latter include redundancy in language structure or statistical…
asked by robert
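
A compressor removes redundancy, not entropy, and the distinction shows up immediately if you compress a predictable input next to a high-entropy one. A quick check (exact output sizes will vary slightly with the zlib version):

```python
import os
import zlib

redundant = b"A" * 10_000          # highly predictable input
high_entropy = os.urandom(10_000)  # OS randomness: ~8 bits/byte

for name, data in [("redundant", redundant), ("high-entropy", high_entropy)]:
    out = zlib.compress(data, level=9)
    print(f"{name}: {len(data)} -> {len(out)} bytes")

# The redundant input shrinks to a few dozen bytes; the high-entropy
# input stays the same size or grows slightly.
```
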
32 votes · 7 answers

Is there a connection between the halting problem and thermodynamic entropy?

Alan Turing proposed a model for a machine (the Turing Machine, TM) which computes (numbers, functions, etc.) and proved the Halting Theorem. A TM is an abstract concept of a machine (or engine if you like). The Halting Theorem is an impossibility…
asked by Nikos M.
31 votes · 2 answers

Simulating a probability of 1 in 2^N with fewer than N random bits

Say I need to simulate the following discrete distribution: $$ P(X = k) = \begin{cases} \frac{1}{2^N}, & \text{if $k = 1$} \\ 1 - \frac{1}{2^N}, & \text{if $k = 0$} \end{cases} $$ The most obvious way is to draw $N$ random bits and check if all of…
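
One standard trick, sketched here under the assumption that single random bits can be drawn on demand: examine the bits lazily and stop at the first 0. The distribution is unchanged, but the expected number of bits consumed stays below 2 regardless of N:

```python
import random

def sample_x(n: int) -> tuple[int, int]:
    """Return (X, bits_used), where P(X = 1) = 1 / 2**n.

    X = 1 exactly when all n bits are 1, so stopping at the first 0
    never changes the distribution; it only saves bits.
    """
    for i in range(1, n + 1):
        if random.getrandbits(1) == 0:
            return 0, i   # the first 0 settles X = 0 immediately
    return 1, n           # n ones in a row: X = 1

# Expected bits used: sum_{i>=1} i * 2**-i < 2, independent of n.
```
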
26 votes · 12 answers

Is von Neumann's "randomness in sin" quote no longer applicable?

Some chap said the following: Anyone who attempts to generate random numbers by deterministic means is, of course, living in a state of sin. That's always taken to mean that you can't generate true random numbers with just a computer. And he said…
asked by Paul Uszak
22 votes · 2 answers

How does an operating system create entropy for random seeds?

On Linux, the files /dev/random and /dev/urandom are the blocking and non-blocking (respectively) sources of pseudo-random bytes. They can be read as normal files: $ hexdump /dev/random 0000000 28eb d9e7 44bb 1ac9 d06f b943 f904 8ffa 0000010…
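
For callers who just want bytes from the kernel's pool rather than an explanation of it, a minimal Python equivalent of the hexdump example (os.urandom is backed by the same kernel CSPRNG as /dev/urandom):

```python
import os

# Portable route: ask the OS CSPRNG directly.
print(os.urandom(16).hex())

# Roughly what the shell example above does (Linux/Unix only):
with open("/dev/urandom", "rb") as f:
    print(f.read(16).hex())
```
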
19 votes · 2 answers

What's harder: Shuffling a sorted deck or sorting a shuffled one?

You have an array of $n$ distinct elements. You have access to a comparator (a black box function taking two elements $a$ and $b$ and returning true iff $a < b$) and a truly random source of bits (a black box function taking no arguments and…
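
Both directions are governed by the same information bound of about log2(n!): sorting needs roughly that many comparisons, and shuffling needs roughly that many random bits. A sketch of the shuffling side, with the bit accounting done by rejection sampling (an illustrative cost model, not the question's formal one):

```python
import math
import random

def rand_below(k: int) -> int:
    """Uniform integer in [0, k), built from raw random bits."""
    bits = (k - 1).bit_length()
    while True:                    # rejection sampling keeps it exactly uniform
        r = random.getrandbits(bits)
        if r < k:
            return r

def fisher_yates(a: list) -> None:
    """Uniformly shuffle in place; consumes ~log2(n!) random bits on average."""
    for i in range(len(a) - 1, 0, -1):
        j = rand_below(i + 1)
        a[i], a[j] = a[j], a[i]

print(math.log2(math.factorial(52)))  # ≈ 225.6 bits to shuffle a 52-card deck
```
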
15 votes · 3 answers

Shannon Entropy of 0.922, 3 Distinct Values

Given a string of values $AAAAAAAABC$, the Shannon Entropy in log base $2$ comes to $0.922$. From what I understand, in base $2$ the Shannon Entropy rounded up is the minimum number of bits in binary to represent a single one of the values. Taken…
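
The 0.922 figure is bits per symbol, not bits for the whole string; the 10-symbol string needs about 9.22 bits in total, which is where the "rounded up" reading goes astray. Reproducing the number:

```python
from collections import Counter
from math import log2

s = "AAAAAAAABC"
n = len(s)
# Shannon entropy in bits/symbol: H = -sum_x p(x) * log2 p(x)
H = -sum((c / n) * log2(c / n) for c in Counter(s).values())
print(round(H, 3), round(n * H, 1))  # 0.922 bits/symbol, 9.2 bits total
```
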
12 votes · 2 answers

Is there a generalization of Huffman coding to arithmetic coding?

In trying to understand the relationships between Huffman coding, arithmetic coding, and range coding, I began to think of the shortcomings of Huffman coding as related to the problem of fractional bit-packing. That is, suppose you have 240…
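
The fractional-bit issue can be made concrete. With a 240-symbol alphabet, each symbol carries log2 240 ≈ 7.907 bits, and positional (base-240) packing, which is arithmetic coding specialized to a uniform source, approaches that rate instead of spending a whole 8 bits per symbol. A rough sketch:

```python
from math import ceil, log2

BASE = 240  # alphabet size from the question

def pack(symbols: list[int]) -> tuple[int, int]:
    """Pack base-240 symbols into one integer (positional code)."""
    value = 0
    for s in symbols:
        value = value * BASE + s
    return value, ceil(len(symbols) * log2(BASE))

def unpack(value: int, count: int) -> list[int]:
    out = []
    for _ in range(count):
        value, s = divmod(value, BASE)
        out.append(s)
    return out[::-1]

syms = [239, 0, 17, 100, 4, 77, 200, 3, 150, 9, 31]
packed, nbits = pack(syms)
assert unpack(packed, len(syms)) == syms
print(nbits, "bits packed vs", 8 * len(syms), "bits naive")  # 87 vs 88
```
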
12 votes · 4 answers

Compression of Random Data is Impossible?

A few days ago this appeared on HN: http://www.patrickcraig.co.uk/other/compression.htm. This refers to a challenge from 2001, where someone was offering a prize of \$5000 for any kind of reduction in the size of randomly generated data (the…
asked by user3467349
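
The impossibility argument behind such challenges is pure counting; spelled out numerically (the length n = 16 is arbitrary):

```python
# Pigeonhole: there are 2**n bitstrings of length n, but only 2**n - 1
# bitstrings of any length strictly below n (lengths 0 through n-1).
# So no lossless, invertible scheme can shorten every length-n input.
n = 16
inputs = 2**n
shorter_outputs = sum(2**k for k in range(n))
print(inputs, shorter_outputs)  # 65536 vs 65535: some input cannot shrink
```
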
10 votes · 1 answer

How to practically measure entropy of a file?

I'm trying to measure how much non-redundant (actual) information my file contains. Some call this the amount of entropy. Of course there is the standard $-\sum_x p(x) \log p(x)$, but I think that Shannon was only considering it from the point of view of…
asked by Paul Uszak
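
The usual first measurement comes with exactly the caveat the asker is circling: a zero-order byte histogram sees symbol frequencies but no structure, so it overestimates the entropy of anything with inter-byte correlations. A sketch:

```python
import sys
from collections import Counter
from math import log2

def byte_entropy(path: str) -> float:
    """Empirical zero-order entropy in bits per byte (at most 8.0)."""
    with open(path, "rb") as f:
        data = f.read()
    n = len(data)
    return -sum((c / n) * log2(c / n) for c in Counter(data).values())

if __name__ == "__main__":
    print(f"{byte_entropy(sys.argv[1]):.4f} bits/byte")
```
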
10 votes · 1 answer

Constrained Optimization Problem in Matrix Entropy

I have a constrained optimization problem in the (Shannon) matrix entropy $\mathtt{sum(entr(eig(A)))}$. The matrix $A$ can be written as the sum of rank-1 matrices of the form $v_i v_i^T$, where $v_i$ is a given normalized vector. The…
asked by Dries
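
The objective is easy to evaluate (if not to optimize) outside a solver. A numpy sketch, where the weights w on the rank-1 terms are a hypothetical stand-in for whatever the actual decision variables are:

```python
import numpy as np

def matrix_entropy(A: np.ndarray) -> float:
    """sum(entr(eig(A))) for symmetric PSD A, i.e. -sum_i lam_i * log(lam_i)."""
    lam = np.linalg.eigvalsh(A)
    lam = lam[lam > 1e-12]          # entr(0) = 0 by convention
    return float(-np.sum(lam * np.log(lam)))

rng = np.random.default_rng(0)
V = rng.standard_normal((5, 3))
V /= np.linalg.norm(V, axis=1, keepdims=True)  # normalized vectors v_i
w = np.array([0.4, 0.3, 0.1, 0.1, 0.1])        # hypothetical weights
A = sum(wi * np.outer(vi, vi) for wi, vi in zip(w, V))
print(matrix_entropy(A))
```
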
9 votes · 1 answer

Rényi entropy at infinity or min-entropy

I'm reading a paper that refers to the limit of the Rényi entropy as $n$ goes to infinity. It defines it as $H_n(X) = \frac{1}{1-n} \log_2 \left( \sum_{i=1}^{N} p_i^n \right)$. It then says that the limit as $n \to \infty$ is…
asked by Mitchell Kaplan
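
The limit is quick to verify numerically: as n grows, the largest p_i dominates the sum, so H_n tends to -log2(max_i p_i), the min-entropy. The distribution below is an arbitrary example:

```python
from math import log2

p = [0.5, 0.25, 0.125, 0.125]

def renyi(p, n: float) -> float:
    """Renyi entropy H_n = log2(sum p_i**n) / (1 - n), for n != 1."""
    return log2(sum(pi**n for pi in p)) / (1 - n)

for n in (2, 5, 20, 100):
    print(n, renyi(p, n))   # decreases toward the min-entropy

print(-log2(max(p)))        # min-entropy: 1.0
```
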
7 votes · 2 answers

How best to statistically verify random numbers?

Let's say I have 1000 bytes that are supposedly random. I want to verify with a certain confidence that they are indeed random and evenly distributed across all byte values. Aside from calculating the standard deviation and mean value, what are my…
asked by Mr. Negi
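
One standard first check (nowhere near sufficient on its own; test batteries such as Dieharder or NIST SP 800-22 go much further) is a chi-squared goodness-of-fit test against the uniform byte distribution. A sketch with scipy; note that 1000 bytes over 256 bins gives expected counts of about 4, below the usual rule of thumb of 5 per bin, so more data makes the test sharper:

```python
import os
from collections import Counter
from scipy.stats import chisquare

def chi2_uniform_bytes(data: bytes):
    """Chi-squared test of the byte histogram against uniformity."""
    counts = Counter(data)
    observed = [counts.get(b, 0) for b in range(256)]
    return chisquare(observed)  # null hypothesis: all 256 values equally likely

stat, p_value = chi2_uniform_bytes(os.urandom(100_000))
print(stat, p_value)  # consistently tiny p-values would be suspicious
```
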
7 votes · 2 answers

Compressing normally distributed data

Given normally distributed integers with a mean of 0 and a standard deviation $\sigma$ around 1000, how do I compress those numbers (almost) perfectly? Given the entropy of the Gaussian distribution, it should be possible to store any value $x$…
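
The achievable rate is easy to pin down: for integer-rounded Gaussian data the entropy is very close to the differential entropy log2(σ√(2πe)), about 12.0 bits per value at σ = 1000, and an arithmetic coder driven by the Gaussian model approaches it. A numerical check (the ±20σ summation range is just a safe truncation):

```python
from math import e, exp, log2, pi, sqrt

sigma = 1000.0

def discrete_gaussian_entropy(sigma: float, span: int = 20_000) -> float:
    """Entropy in bits of the Gaussian rounded to integers, by direct sum."""
    weights = [exp(-k * k / (2 * sigma**2)) for k in range(-span, span + 1)]
    z = sum(weights)
    return -sum((w / z) * log2(w / z) for w in weights)

print(discrete_gaussian_entropy(sigma))  # ≈ 12.01 bits/value
print(log2(sigma * sqrt(2 * pi * e)))    # ≈ 12.01 bits/value, closed form
```
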