Questions tagged [entropy]

This tag is for questions about mathematical entropy. If you have a question about thermodynamic entropy, visit Physics Stack Exchange or Chemistry Stack Exchange instead.

0 votes
0 answers
53 views

Max conditional entropy

Random vectors $(X,Y)$ are distributed over $\mathbb{R}^n \times \mathbb{R}^m$ with zero mean and covariance $\Sigma \in \mathbb{R}^{(n+m)\times (n+m)}_+.$ What distributions attain the maximum value ...
Christian Chapman
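
A standard related fact, sketched here for context (not taken from the question, and assuming the block $\Sigma_{YY}$ is invertible): for a fixed joint covariance, the jointly Gaussian distribution maximizes differential entropy, and hence also the conditional entropy.

```latex
% Sketch: maximum-entropy bound for a fixed joint covariance
% (assumes \Sigma_{YY} is invertible; the partition notation is illustrative).
\[
  \Sigma =
  \begin{pmatrix}
    \Sigma_{XX} & \Sigma_{XY} \\
    \Sigma_{YX} & \Sigma_{YY}
  \end{pmatrix},
  \qquad
  h(X \mid Y) \;\le\; \tfrac{1}{2}\log\!\Big( (2\pi e)^{n}
    \det\!\big(\Sigma_{XX} - \Sigma_{XY}\,\Sigma_{YY}^{-1}\,\Sigma_{YX}\big) \Big),
\]
with equality when $(X,Y)$ is jointly Gaussian with covariance $\Sigma$.
```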
0 votes
0 answers
22 views

Approximation of entropy of binomial distribution

The approximation of entropy of binomial distribution is: $$\frac1 2 \log_2 \big( 2\pi e\, np(1-p) \big) + O \left( \frac{1}{n} \right)$$ Based on my understanding, this approximation is for large n ...
MarcG
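
As a quick numerical illustration of the approximation in the question (a sketch assuming SciPy is available; the helper names are mine):

```python
# Compare the exact entropy of Binomial(n, p), in bits, with the
# large-n approximation (1/2) * log2(2*pi*e*n*p*(1-p)).
import math
from scipy.stats import binom

def exact_entropy_bits(n, p):
    pmf = binom.pmf(range(n + 1), n, p)
    return -sum(q * math.log2(q) for q in pmf if q > 0)

def approx_entropy_bits(n, p):
    return 0.5 * math.log2(2 * math.pi * math.e * n * p * (1 - p))

for n in (10, 100, 1000):
    p = 0.3
    print(n, exact_entropy_bits(n, p), approx_entropy_bits(n, p))
# The gap shrinks roughly like 1/n as n grows.
```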
0 votes
0 answers
40 views

Convergence of iterated average posterior to a high-entropy distribution

Setup: Assume $p_Y \in \Delta^n$ is a discrete probability distribution obtained by $p_Y=L_{Y|X}p_X$, where $L_{Y|X} \in \mathbb{R}^{n \times m}$ is an arbitrary likelihood (i.e., a column-stochastic ...
backboltz37
0 votes
0 answers
26 views

Shannon's finite state transducer (FST) entropy theorem

I am trying to make sense of the proof of Shannon's theorem that a finite state transducer cannot increase the entropy of its input. I would love some sort of drawing or intuitive formulation of it, ...
iuerlhgw
1 vote
1 answer
38 views

Doubts on "An Intensive Introduction to Cryptography" exercise about Shannon's entropy

I was going through the exercises in An Intensive Introduction to Cryptography (see full PDF here), and in particular, I had some doubts about Exercise 0.12 (found on page 42). Here is the relevant ...
chirpyboat73
3 votes
1 answer
93 views

Approximating the Prime Counting Function as $\pi(x) \approx \frac{x^2}{\ln\left(\Gamma(x+1)\right)}$

Approximating the Prime Counting Function as $\boxed{\pi(x) \approx \frac{x^2}{\ln\left(\Gamma(x+1)\right)}}$. Intro: In an unrelated topic I was looking at how statistical mechanics ...
Joako
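
A quick way to see how the proposed formula behaves is to compare it numerically with the exact prime-counting function (a sketch assuming SymPy is installed; the helper name approx_pi is mine):

```python
# Compare pi(x) with x^2 / ln(Gamma(x+1)); math.lgamma gives ln(Gamma(x+1))
# and sympy.primepi gives the exact prime count.
import math
from sympy import primepi

def approx_pi(x):
    return x * x / math.lgamma(x + 1)

for x in (10**2, 10**3, 10**4, 10**5):
    print(x, int(primepi(x)), round(approx_pi(x), 1))
```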
0 votes
0 answers
8 views

Is a concave parametric curve along a concave surface guaranteed to be concave along another concave surface?

Take a parametrized probability distribution $\mathbf{p}(\theta)=( p_0(\theta),p_1(\theta),\cdots p_n(\theta))$ and two permutation-symmetric, everywhere-concave functions $S_1(\mathbf{p})$ and $S_2(\...
Quantum Mechanic
1 vote
2 answers
77 views

Von Neumann entropy vs Shannon entropy

Let us consider a mixture of quantum states $$ \rho = \sum_i p_i \left\vert \psi_i \right\rangle \left\langle \psi_i \right\vert, \qquad \text{with probability distribution } p_i. $$ If the $\psi_i$ form ...
Jip
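
A small numerical sketch of the comparison (assuming NumPy; the example state is mine, not from the question): the Shannon entropy of the mixing weights upper-bounds the von Neumann entropy, with equality when the $\psi_i$ are orthonormal.

```python
# Shannon entropy H(p) of the mixing weights vs. von Neumann entropy
# S(rho) = -Tr(rho log rho) of the resulting mixed state.
import numpy as np

def shannon(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def von_neumann(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return -np.sum(evals * np.log2(evals))

# Equal mixture of |0> and the non-orthogonal state (|0> + |1>)/sqrt(2).
psi0 = np.array([1.0, 0.0])
psi1 = np.array([1.0, 1.0]) / np.sqrt(2)
p = [0.5, 0.5]
rho = p[0] * np.outer(psi0, psi0) + p[1] * np.outer(psi1, psi1)
print(shannon(p), von_neumann(rho))  # H(p) = 1 bit, S(rho) ≈ 0.60 bit
```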
0 votes
1 answer
58 views

Is the entropy of a distribution that follows the exponential family differential equation always concave in the natural parameter?

Are there any proofs, conjectures, counterexamples, or other helpful references related to the titular question? To clarify, let the entropy of a random variable $X$ distributed according to a ...
nlupugla
4 votes
2 answers
293 views

Prove that $p = 1/2$ maximizes the entropy of a binomial distribution. [closed]

I am trying to find a proof that the entropy of a Binomial distribution is maximized at $p = 1/2$. By symmetry, I can easily show that $p = 1/2$ is a local optimum, but I'm stuck trying to show that $...
nlupugla
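
Not a proof, but a quick numerical sanity check of the claim (a sketch assuming SciPy; the helper names are mine):

```python
# Scan p on a grid and locate the maximizer of the entropy of Binomial(n, p).
import numpy as np
from scipy.stats import binom

def binomial_entropy(n, p):
    pmf = binom.pmf(np.arange(n + 1), n, p)
    pmf = pmf[pmf > 0]
    return -np.sum(pmf * np.log2(pmf))

for n in (5, 20, 100):
    grid = np.linspace(0.01, 0.99, 99)
    best = max(grid, key=lambda q: binomial_entropy(n, q))
    print(n, best)  # the maximizer sits at p = 0.5 for every n
```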
0 votes
0 answers
18 views

How can I derive equation 2.23 (Newton-Raphson for entropy) in the NASA CEA analysis?

It might sound a bit basic, but I'd like to follow the analysis in NASA CEA Report I from the beginning. So I have to derive the Newton-Raphson equation from the entropy equation, which is one of the ...
dave
0 votes
0 answers
17 views

KL Divergence larger than Conditional KL Divergence

Let $q(z|y)$ and $r(z)$ be variational approximations of $p(z|y)$ and $p(z)$, respectively. If I know that $H(q(z|y))=H(r(z))$, where $H$ is the entropy, I'd like to know if it is true that: \begin{...
AAA
-2 votes
1 answer
56 views

How to understand the calculations for the density matrices of a qubit state [closed]

From a lecture, I wrote down this set of calculations for the entropy of a given qubit state. What I am unsure of, since it was not shown in the lecture, is how the lecturer arrived at each of these matrices. ...
Superunknown
0 votes
0 answers
29 views

Diffusion PM: Remove the edge effect at t = 0

In Appendix B.2 of the DPM paper, in proving that the $\int d\mathrm{x}^{(0)}\, d\mathrm{x}^{(1)}\, q(\mathrm{x}^{(0)}, \mathrm{x}^{(1)}) \log \left[ \frac{\pi(\mathrm{x}^{(0)})}{\pi(\mathrm{x}^{(1)})} \right]$ ...
Zohreh Adabi
0 votes
2 answers
66 views

Differentiating Entropy with respect to Convolution Parameters

First, some formula reminders for the sake of completeness: $H(X) = -\sum_{i} p(x_i) \log p(x_i)$ is the entropy of a discrete random variable $X$ with outcomes $x_i$, where $p(x_i)$ is the probability of $x_i$. A discrete ...
2 False
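
A minimal sketch of the basic ingredient (assuming NumPy; this is not the asker's convolution setup): the partial derivative is $\partial H / \partial p_i = -(\log p_i + 1)$ in nats, checked here against finite differences; differentiating with respect to convolution parameters $\theta$ then composes this with $\partial p_i / \partial \theta$ via the chain rule.

```python
# Verify dH/dp_i = -(log p_i + 1) for H(p) = -sum_i p_i log p_i
# by comparing with central finite differences.
import numpy as np

def entropy(p):
    return -np.sum(p * np.log(p))

p = np.array([0.1, 0.2, 0.3, 0.4])
analytic = -(np.log(p) + 1.0)

eps = 1e-6
numeric = np.array([
    (entropy(p + eps * np.eye(len(p))[i]) - entropy(p - eps * np.eye(len(p))[i])) / (2 * eps)
    for i in range(len(p))
])
print(np.allclose(analytic, numeric, atol=1e-5))  # True
```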
