Questions tagged [entropy]
This tag is for questions about mathematical entropy. If you have a question about thermodynamic entropy, visit Physics Stack Exchange or Chemistry Stack Exchange instead.
1,653 questions
0 votes · 0 answers · 53 views
Max conditional entropy
Random vectors $(X,Y)$ are distributed over $\mathbb{R}^n \times \mathbb{R}^m$ with zero mean and covariance $\Sigma \in \mathbb{R}^{(n+m)\times (n+m)}_+.$
What distributions attain the maximum value ...
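A useful benchmark here (a standard fact, not a complete answer): writing $\Sigma = \begin{pmatrix} \Sigma_X & \Sigma_{XY} \\ \Sigma_{YX} & \Sigma_Y \end{pmatrix}$ and assuming $\Sigma_Y$ is invertible,
$$
h(X \mid Y) \le \frac{1}{2} \log\!\Big( (2\pi e)^{n} \det\big(\Sigma_X - \Sigma_{XY}\Sigma_Y^{-1}\Sigma_{YX}\big) \Big),
$$
with equality when $(X,Y)$ is jointly Gaussian with covariance $\Sigma$: indeed $h(X \mid Y) = h(X - AY \mid Y) \le h(X - AY)$ for any matrix $A$, and $A = \Sigma_{XY}\Sigma_Y^{-1}$ minimizes the covariance of the residual, giving the Schur complement above.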
0 votes · 0 answers · 22 views
Approximation of entropy of binomial distribution
The approximation of the entropy of the binomial distribution is:
$$\frac{1}{2} \log_2 \big( 2\pi e\, np(1-p) \big) + O\!\left( \frac{1}{n} \right)$$
Based on my understanding, this approximation holds for large $n$ ...
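A quick way to see the quality of this approximation is to compare it with the exact entropy as $n$ grows; a minimal Python sketch (the choice $p = 0.3$ and the sample sizes are just for illustration):

```python
from math import comb, log2, pi, e

def exact_entropy(n, p):
    """Exact Shannon entropy (bits) of Binomial(n, p)."""
    pk = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
    return -sum(q * log2(q) for q in pk if q > 0)

p = 0.3
for n in (10, 100, 1000):
    approx = 0.5 * log2(2 * pi * e * n * p * (1 - p))
    print(n, exact_entropy(n, p), approx)  # the gap shrinks roughly like O(1/n)
```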
0 votes · 0 answers · 40 views
Convergence of iterated average posterior to a high-entropy distribution
Setup
Assume $p_Y \in \Delta^n$ is a discrete probability distribution obtained by $p_Y = L_{Y|X}\, p_X$, where $L_{Y|X} \in \mathbb{R}^{n \times m}$ is an arbitrary likelihood (i.e., a column-stochastic ...
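The excerpt is cut off, but under one possible reading of "iterated average posterior" (form the posteriors $p(x \mid y)$ by Bayes, average them uniformly over $y$, and feed the result back as the new prior), the dynamics are easy to experiment with numerically; a sketch with placeholder dimensions and a random likelihood:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 6
L = rng.random((n, m))
L /= L.sum(axis=0, keepdims=True)   # column-stochastic: each column sums to 1

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

p_X = rng.dirichlet(np.ones(m))
for _ in range(50):
    p_Y = L @ p_X                    # induced marginal p_Y = L p_X
    post = (L * p_X) / p_Y[:, None]  # post[y, x] = p(x | y) by Bayes' rule
    p_X = post.mean(axis=0)          # uniform average of the posteriors over y
print(entropy(p_X), np.log(m))       # compare against the maximum entropy log m
```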
0 votes · 0 answers · 26 views
Shannon's finite state transducer (FST) entropy theorem
I am trying to make sense of the proof of Shannon's theorem that a finite state transducer cannot increase the entropy of its input.
I would love some sort of drawing or intuitive formulation of it, ...
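For intuition, one standard way to phrase the argument (a sketch, with $s$ the number of transducer states, and modulo bookkeeping for variable-length outputs): since the output block $Y_1^n$ is a deterministic function of the input block $X_1^n$ and the initial state $S_1$,
$$
H(Y_1^n) \le H(X_1^n, S_1) = H(X_1^n) + H(S_1 \mid X_1^n) \le H(X_1^n) + \log s .
$$
Dividing by $n$ and letting $n \to \infty$ kills the $\log s$ term, so the per-symbol entropy of the output cannot exceed that of the input.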
1 vote · 1 answer · 38 views
Doubts about an exercise on Shannon entropy in "An Intensive Introduction to Cryptography"
I was going through the exercises in An Intensive Introduction to Cryptography (see full PDF here), and in particular I had some doubts about Exercise 0.12 (found on page 42). Here is the relevant ...
3 votes · 1 answer · 93 views
Approximating the Prime Counting Function as $\pi(x) \approx \frac{x^2}{\ln\left(\Gamma(x+1)\right)}$
Intro
In an unrelated topic I was looking at how statistical mechanics ...
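One reason the formula tracks $\pi(x)$: since $\ln\Gamma(x+1) = \sum_{k \le x} \ln k \approx x\ln x - x$ by Stirling, the proposed expression is essentially $x/(\ln x - 1)$ in disguise. A quick numerical comparison (the sieve and the sample points are just for illustration):

```python
import math

def prime_count(n):
    """pi(n) via a plain sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, math.isqrt(n) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(range(i * i, n + 1, i)))
    return sum(sieve)

for x in (10**3, 10**4, 10**5, 10**6):
    approx = x * x / math.lgamma(x + 1)   # x^2 / ln(Gamma(x+1))
    print(x, prime_count(x), round(approx, 1))
```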
0 votes · 0 answers · 8 views
Is a concave parametric curve along a concave surface guaranteed to be concave along another concave surface?
Take a parametrized probability distribution $\mathbf{p}(\theta)=( p_0(\theta),p_1(\theta),\cdots p_n(\theta))$ and two permutation-symmetric, everywhere-concave functions $S_1(\mathbf{p})$ and $S_2(\...
1 vote · 2 answers · 77 views
Von Neumann entropy vs Shannon entropy
Let us consider a mixture of quantum states
$$
\rho = \sum_i p_{i} \left\vert \psi_i \right\rangle \left\langle \psi_i \right\vert
$$
with probability distribution $p_{i}$. If the $\psi_{i}$ form ...
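The key comparison is $S(\rho) \le H(p)$, with equality exactly when the $\psi_i$ are orthogonal; a small numerical illustration (the two states below are arbitrary choices):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return -np.sum(w * np.log2(w))

p = np.array([0.5, 0.5])
psi0 = np.array([1.0, 0.0])
psi1 = np.array([1.0, 1.0]) / np.sqrt(2)       # deliberately NOT orthogonal to psi0
rho = sum(pi * np.outer(v, v) for pi, v in zip(p, (psi0, psi1)))
shannon = -np.sum(p * np.log2(p))              # H(p) = 1 bit
print(von_neumann_entropy(rho), shannon)       # S(rho) ~ 0.60 bit < 1 bit
```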
0 votes · 1 answer · 58 views
Is the entropy of a distribution that follows the exponential family differential equation always concave in the natural parameter?
Are there any proofs, conjectures, counterexamples, or other helpful references related to the titular question?
To clarify, let the entropy of a random variable $X$ distributed according to a ...
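One cheap way to probe the titular question is numerically on a concrete family (whether it matches the question's exact setup is for the asker to check); a sketch using the Bernoulli family in its natural parameterization $\theta = \log\frac{p}{1-p}$, where $H(\theta) = \log(1 + e^{\theta}) - \theta\,\sigma(\theta)$ and the grid is arbitrary:

```python
import numpy as np

theta = np.linspace(-8, 8, 2001)
sig = 1.0 / (1.0 + np.exp(-theta))               # sigmoid, i.e. p(theta)
H = np.log1p(np.exp(theta)) - theta * sig        # Bernoulli entropy in nats
H2 = np.gradient(np.gradient(H, theta), theta)   # crude second derivative
print(H2.min(), H2.max())  # both signs appear => not concave in theta everywhere
```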
4 votes · 2 answers · 293 views
Prove that $p = 1/2$ maximizes the entropy of a binomial distribution. [closed]
I am trying to find a proof that the entropy of a Binomial distribution is maximized at $p = 1/2$. By symmetry, I can easily show that $p = 1/2$ is a local optimum, but I'm stuck trying to show that $...
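Not a proof, but a cheap numerical sanity check of the claim; a sketch (the grid resolution and $n$ are arbitrary):

```python
from math import comb, log2

def binom_entropy(n, p):
    """Entropy (bits) of Binomial(n, p), assuming 0 < p < 1."""
    pk = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
    return -sum(q * log2(q) for q in pk if q > 0)

n = 25
grid = [i / 200 for i in range(1, 200)]
print(max(grid, key=lambda p: binom_entropy(n, p)))   # prints 0.5 on this grid
```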
0 votes · 0 answers · 18 views
How can I derive equation 2.23 (Newton-Raphson for entropy) in the NASA CEA analysis?
It might sound a bit basic, but I'd like to follow the analysis in NASA CEA Report I from the beginning.
So I have to derive the Newton-Raphson equation from the entropy equation, which is one of the ...
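Without the report at hand, one can at least recall the generic template such derivations instantiate: to solve $g(\mathbf{x}) = \mathbf{0}$, Newton-Raphson linearizes about the current iterate,
$$
g(\mathbf{x}_k) + J(\mathbf{x}_k)\,\Delta\mathbf{x}_k \approx \mathbf{0}
\quad\Longrightarrow\quad
\mathbf{x}_{k+1} = \mathbf{x}_k - J(\mathbf{x}_k)^{-1}\, g(\mathbf{x}_k),
$$
where $J$ is the Jacobian of $g$. In CEA-type analyses, $g$ collects the constraint residuals (the specified-entropy condition among them), and equation 2.23 should follow from writing this linearization for the chosen unknowns.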
0 votes · 0 answers · 17 views
KL Divergence larger than Conditional KL Divergence
Let $q(z|y)$ and $r(z)$ be variational approximations of $p(z|y)$ and $p(z)$, respectively. If I know that $H(q(z|y))=H(r(z))$, where $H$ is the entropy, I'd like to know if it is true that:
\begin{...
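The excerpt is truncated, but a decomposition that often settles comparisons of this shape: with $\bar q(z) = \mathbb{E}_{p(y)}[\,q(z \mid y)\,]$ the averaged variational posterior,
$$
\mathbb{E}_{p(y)}\!\left[ D_{\mathrm{KL}}\big(q(z \mid y)\,\|\,r(z)\big) \right]
= \mathbb{E}_{p(y)}\!\left[ D_{\mathrm{KL}}\big(q(z \mid y)\,\|\,\bar q(z)\big) \right]
+ D_{\mathrm{KL}}\big(\bar q(z)\,\|\,r(z)\big),
$$
so the averaged KL to any fixed $r$ always exceeds the KL of the average; this identity is often the lever needed for such inequalities.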
-2 votes · 1 answer · 56 views
How to understand the calculations for the density matrices of a qubit state [closed]
From a lecture, I copied this set of calculations for the entropy of a given qubit state.
What I am unsure of, since it was not shown in the lecture, is how the lecturer came to each of these matrices.
...
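For reference, the mechanical step that lectures often skip: for a pure qubit state $\lvert\psi\rangle = \alpha\lvert 0\rangle + \beta\lvert 1\rangle$, the density matrix is the outer product
$$
\rho = \lvert\psi\rangle\langle\psi\rvert
= \begin{pmatrix} \alpha \\ \beta \end{pmatrix}
  \begin{pmatrix} \alpha^* & \beta^* \end{pmatrix}
= \begin{pmatrix} |\alpha|^2 & \alpha\beta^* \\ \alpha^*\beta & |\beta|^2 \end{pmatrix},
$$
a mixture of such states is the probability-weighted sum of these matrices, and the entropy $-\operatorname{Tr}(\rho \log \rho)$ is computed from the eigenvalues of $\rho$.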
0 votes · 0 answers · 29 views
Diffusion PM: Remove the edge effect at t = 0
In the DPM paper, Appendix B.2, for proving that $\int d\mathrm{x}^{(0)}\, d\mathrm{x}^{(1)}\, q(\mathrm{x}^{(0)}, \mathrm{x}^{(1)}) \log \left[ \frac{\pi(\mathrm{x}^{(0)})}{\pi(\mathrm{x}^{(1)})} \right]$ ...
0 votes · 2 answers · 66 views
Differentiating Entropy with respect to Convolution Parameters
First, some formula reminders for the sake of completeness:
$H(X) = -\sum_{i} p(x_i) \log p(x_i)$ is the entropy of a sequence $x_i$, where $p(x)$ is the discrete probability of $x$.
A discrete ...
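The excerpt cuts off, but the core identity for this kind of differentiation: if the $p_i$ depend on a parameter $\theta$ (e.g. through convolution weights) and $\sum_i p_i = 1$, then
$$
\frac{\partial H}{\partial \theta}
= -\sum_i \frac{\partial p_i}{\partial \theta}\,\big(\log p_i + 1\big)
= -\sum_i \frac{\partial p_i}{\partial \theta}\,\log p_i ,
$$
where the $+1$ term drops because the derivatives $\partial p_i / \partial \theta$ sum to zero; the convolution then enters only through $\partial p_i / \partial \theta$ via the chain rule.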