Questions tagged [entropy]

This tag is for questions about mathematical entropy. If you have a question about thermodynamic entropy, visit Physics Stack Exchange or Chemistry Stack Exchange instead.

1414 questions
150
votes
15 answers

Intuitive explanation of entropy

I have bumped into entropy many times, but it has never been clear to me why we use this formula: if $X$ is a random variable, then its entropy is $$H(X) = -\sum_{x} p(x)\log p(x).$$ Why are we using this formula? Where did this formula…
jjepsuomi
  • 8,135
  • 12
  • 49
  • 89
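For readers meeting the formula for the first time, here is a minimal sketch of how $H(X)$ is computed in practice (Python, assuming the distribution is given as a list of probabilities):

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H(X) = -sum_x p(x) log p(x).
    Terms with p(x) = 0 are skipped, following the convention 0 log 0 = 0."""
    return sum(-p * math.log(p, base) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # a fair coin: exactly 1 bit of uncertainty
print(shannon_entropy([1.0, 0.0]))  # a certain outcome: 0 bits
```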
33
votes
4 answers

Shannon entropy of a fair die

The formula for Shannon entropy is as follows, $$\text{Entropy}(S) = - \sum_i p_i \log_2 p_i $$ Thus, a fair six-sided die should have entropy $$- \sum_{i=1}^6 \dfrac{1}{6} \log_2 \dfrac{1}{6} = \log_2 (6) = 2.5849...$$ However, the entropy…
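As a quick sanity check on the arithmetic (a small Python snippet, not part of the question), the six identical terms do sum to $\log_2 6$:

```python
import math

# Entropy of a fair six-sided die: six outcomes, each with probability 1/6.
h = sum(-(1 / 6) * math.log2(1 / 6) for _ in range(6))
print(h, math.log2(6))  # both are approximately 2.5849625007
```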
26
votes
3 answers

An information theory inequality which relates to Shannon Entropy

For $a_1,...,a_n,b_1,...,b_n>0,\quad$ define $a:=\sum a_i,\ b:=\sum b_i,\ s:=\sum \sqrt{a_ib_i}$. Is the following inequality true? $${\frac{\Bigl(\prod a_i^{a_i}\Bigr)^\frac1a}a \cdot \frac{\left(\prod b_i^{b_i}\right)^\frac1b}b…
Amir Parvardi
  • 4,788
  • 2
  • 23
  • 61
24
votes
3 answers

Is Standard Deviation the same as Entropy?

We know that standard deviation (SD) represents the level of dispersion of a distribution. Thus a distribution with only one value (e.g., 1,1,1,1) has an SD equal to zero. Similarly, such a distribution requires little information to be defined. On…
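A small illustrative computation (Python, with made-up distributions) shows how the two measures come apart: entropy depends only on the probabilities, while SD also depends on the values the variable takes.

```python
import math

def entropy(probs):
    """Shannon entropy in bits; zero-probability terms are skipped."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

def sd(values, probs):
    """Standard deviation of a discrete distribution over the given values."""
    mean = sum(v * p for v, p in zip(values, probs))
    return math.sqrt(sum(p * (v - mean) ** 2 for v, p in zip(values, probs)))

probs = [0.5, 0.5]
print(entropy(probs))       # 1.0 bit, regardless of the values below
print(sd([0, 1], probs))    # 0.5
print(sd([0, 100], probs))  # 50.0; SD grows with the spread while entropy does not
```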
23
votes
1 answer

Entropy of a binomial distribution

How do we get the functional form for the entropy of a binomial distribution? Do we use Stirling's approximation? According to Wikipedia, the entropy is: $$\frac1 2 \log_2 \big( 2\pi e\, np(1-p) \big) + O \left( \frac{1}{n} \right)$$ As of now,…
user844541
  • 1,513
  • 3
  • 13
  • 26
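One way to make the approximation concrete (an illustrative Python check, not from the thread) is to compare the exact entropy, computed by direct summation, with the closed form above:

```python
import math

def binom_entropy_bits(n, p):
    """Exact Shannon entropy (bits) of Binomial(n, p) by direct summation.
    Probabilities that underflow to 0.0 contribute negligibly and are skipped."""
    h = 0.0
    for k in range(n + 1):
        pk = math.comb(n, k) * p**k * (1 - p) ** (n - k)
        if pk > 0:
            h -= pk * math.log2(pk)
    return h

n, p = 1000, 0.3
approx = 0.5 * math.log2(2 * math.pi * math.e * n * p * (1 - p))
print(binom_entropy_bits(n, p), approx)  # agree up to the O(1/n) term
```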
21
votes
2 answers

Can the entropy of a random variable with countably many outcomes be infinite?

Consider a random variable $X$ taking values over $\mathbb{N}$. Let $\mathbb{P}(X = i) = p_i$ for $i \in \mathbb{N}$. The entropy of $X$ is defined by $$H(X) = \sum_i -p_i \log p_i.$$ Is it possible for $H(X)$ to be infinite?
VSJ
  • 1,031
  • 7
  • 17
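The answer is yes; a classical construction (standard, not quoted from the thread) takes
$$p_i = \frac{C}{i \log^2 i} \quad (i \ge 2), \qquad C^{-1} = \sum_{i \ge 2} \frac{1}{i \log^2 i} < \infty.$$
This is a valid distribution because the series converges, yet $-p_i \log p_i$ behaves like $C/(i \log i)$, and $\sum_i 1/(i \log i)$ diverges, so $H(X) = \infty$.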
21
votes
1 answer

What are the differences and the relationship between Shannon entropy and Fisher information?

When I first got into information theory, information was measured in terms of Shannon entropy; in other words, most books I had read discussed Shannon entropy. Today someone told me there is another measure of information called Fisher…
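For reference, the two quantities are defined quite differently (standard definitions, stated here for convenience). For a model with density $p(x;\theta)$,
$$H(X) = -\int p(x;\theta) \log p(x;\theta)\,dx, \qquad \mathcal{I}(\theta) = \mathbb{E}\left[\left(\frac{\partial}{\partial \theta} \log p(X;\theta)\right)^{2}\right].$$
Shannon entropy measures the uncertainty in $X$ itself; Fisher information measures how sharply the likelihood of the observed data responds to the parameter $\theta$.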
20
votes
4 answers

Estimating the entropy

Given a discrete random variable $X$, I would like to estimate the entropy of $Y=f(X)$ by sampling. I can sample uniformly from $X$. The samples are just random vectors of length $n$ where the entries are $0$ or $1$. For each sample vector $x_i$, I…
user66307
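A common baseline here is the plug-in (empirical) estimator: draw samples, tabulate the observed values of $Y = f(X)$, and compute the entropy of the empirical distribution. A minimal sketch in Python (the function `f`, the vector length, and the sample size are placeholders, not taken from the question):

```python
import math
import random
from collections import Counter

def plugin_entropy(f, n, num_samples=100_000):
    """Plug-in estimate (bits) of H(f(X)) for X uniform over {0,1}^n.
    Consistent as num_samples grows, though biased downward for small samples."""
    counts = Counter()
    for _ in range(num_samples):
        x = tuple(random.randint(0, 1) for _ in range(n))  # uniform sample of X
        counts[f(x)] += 1
    return sum(-(c / num_samples) * math.log2(c / num_samples)
               for c in counts.values())

# Hypothetical example: Y = parity of the bits, so H(Y) should be close to 1 bit.
print(plugin_entropy(lambda x: sum(x) % 2, n=10))
```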
20
votes
1 answer

Best way to play 20 questions

Background: You and I are going to play a game. To start off with, I play a measurable function $f_1$ and you respond with a real number $y_1$ (possibly infinite). We repeat this some fixed number $N$ of times, to obtain a collection…
18
votes
2 answers

How entropy scales with sample size

For a discrete probability distribution, the entropy is defined as: $$H(p) = -\sum_i p(x_i) \log(p(x_i))$$ I'm trying to use the entropy as a measure of how "flat / noisy" vs. "peaked" a distribution is, where smaller entropy corresponds to more…
nbubis
  • 31,733
  • 7
  • 76
  • 133
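One way to see the scaling issue concretely (an illustrative Python check, not from the question): a perfectly flat distribution over $n$ outcomes has entropy $\log n$, so the raw entropy of a "flat" histogram drifts upward with the number of bins.

```python
import math

def entropy(probs):
    """Shannon entropy in nats; zero-probability terms are skipped."""
    return sum(-p * math.log(p) for p in probs if p > 0)

for n in (10, 100, 1000):
    flat = [1 / n] * n                          # maximally flat: H = log(n)
    peaked = [0.9] + [0.1 / (n - 1)] * (n - 1)  # one dominant outcome
    print(n, entropy(flat), math.log(n), entropy(peaked))
```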
18
votes
1 answer

Entropy of a uniform distribution

The (differential) entropy of a uniform distribution on $[a,b]$ is $\ln(b-a)$. With $a=0$ and $b=1$ this reduces to zero. How come there is no uncertainty?
log2
  • 193
  • 1
  • 1
  • 5
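A standard resolution (not specific to this thread): $\ln(b-a)$ is a differential entropy, which, unlike discrete entropy, can be zero or negative and is not invariant under rescaling:
$$h\big(U(a,b)\big) = \ln(b-a), \qquad h\big(U(0,\tfrac{1}{2})\big) = \ln\tfrac{1}{2} < 0.$$
A value of zero therefore does not mean "no uncertainty"; only differences of differential entropies carry an absolute meaning.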
17
votes
2 answers

Relative entropy for martingale measures

I need some help understanding a note given in a lot of papers I've read. Let $(\Omega,\mathcal{F},P)$ be a complete probability space, $\mathbb{F} = (\mathcal{F}_t)_{t\in[0,T]}$ a filtration satisfying the usual conditions, and $S$ a locally bounded…
17
votes
1 answer

At what rate does the entropy of shuffled cards converge?

Consider a somewhat primitive method of shuffling a stack of $n$ cards: In every step, take the top card and insert it at a uniformly randomly selected one of the $n$ possible positions above, between or below the remaining $n-1$ cards. Start with a…
joriki
  • 215,929
  • 14
  • 263
  • 474
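For small $n$ the convergence can be watched numerically; a rough Monte Carlo sketch in Python (deck size, trial count, and step counts are illustrative choices, not from the question):

```python
import math
import random
from collections import Counter

def top_to_random(deck):
    """One shuffle step: remove the top card and reinsert it uniformly
    at one of the n possible positions."""
    card = deck.pop(0)
    deck.insert(random.randint(0, len(deck)), card)

def shuffle_entropy(n=4, steps=1, trials=100_000):
    """Plug-in estimate (bits) of the entropy of the deck order after `steps` shuffles."""
    counts = Counter()
    for _ in range(trials):
        deck = list(range(n))
        for _ in range(steps):
            top_to_random(deck)
        counts[tuple(deck)] += 1
    return sum(-(c / trials) * math.log2(c / trials) for c in counts.values())

for k in range(1, 9):
    print(k, shuffle_entropy(steps=k))  # approaches log2(4!), about 4.585 bits
```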
17
votes
5 answers

Is there a "most random" state in Rubik's cube?

Is there a state of Rubik's cube which can be considered to have the highest degree of randomness (maximum entropy?), assuming that the solved Rubik's cube has the lowest?
kunjan kshetri
  • 470
  • 1
  • 9
  • 19
16
votes
2 answers

Non-zero Conditional Differential Entropy between a random variable and a function of it

Consider two continuous random variables, where one is a function of the other: $X$ and $Y=g\left(X\right)$. Their mutual information is defined…
Alecos Papadopoulos
  • 9,930
  • 1
  • 24
  • 42