What is the relationship between these two definitions for the max-entropy?

Quantum Computing Asked on April 6, 2021

On Wikipedia, the max-entropy for classical systems is defined as

$$H_{0}(A)_{\rho}=\log \operatorname{rank}\left(\rho_{A}\right)$$

The term max-entropy in quantum information is reserved for the following definition

$$H_{\max}(A)_{\rho}=2 \cdot \log \operatorname{tr}\left[\rho_{A}^{1/2}\right]$$

While these are just definitions, they go by the same name, so is there a relationship between them?

What I know

The only thing I managed to prove was that $H_0(A)_\rho \geq H_{\max}(A)_\rho$. The proof is below. Let $\lambda_i$ be the eigenvalues of $\rho_A$ and $r$ be the rank of $\rho_A$. We have

\begin{align}
H_{\max}(A)_\rho &= 2\log\left(\lambda_1^{1/2} + \dots + \lambda_r^{1/2}\right)\\
&\leq 2\log \left(\frac{1}{r^{1/2}}\cdot r\right)\\
&= \log r = H_0(A)_\rho,
\end{align}

where the sum runs over the $r$ nonzero eigenvalues, and the inequality holds because, by concavity of the square root, $\sum_i \lambda_i^{1/2}$ subject to $\sum_i \lambda_i = 1$ is maximized when all $r$ nonzero eigenvalues equal $1/r$.
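As a quick numerical sanity check of this inequality (a minimal sketch, not from the original post; it assumes NumPy, uses base-2 logarithms, and the helper names are illustrative):

```python
# Minimal numerical sanity check of H_0(A) >= H_max(A) (base-2 logs assumed).
# Helper names are illustrative; only NumPy is required.
import numpy as np

def random_density_matrix(dim, rank):
    """Draw a random rank-deficient density matrix rho = G G^dagger / tr(G G^dagger)."""
    g = np.random.randn(dim, rank) + 1j * np.random.randn(dim, rank)
    rho = g @ g.conj().T
    return rho / np.trace(rho).real

def h_0(rho, tol=1e-10):
    """H_0 = log rank(rho), with the rank read off from eigenvalues above a cutoff."""
    eigs = np.linalg.eigvalsh(rho)
    return np.log2(np.count_nonzero(eigs > tol))

def h_max(rho):
    """H_max = 2 log tr(rho^{1/2}), computed from the eigenvalues of rho."""
    eigs = np.clip(np.linalg.eigvalsh(rho), 0.0, None)
    return 2 * np.log2(np.sum(np.sqrt(eigs)))

for _ in range(5):
    rho = random_density_matrix(dim=6, rank=3)
    print(h_0(rho), h_max(rho))  # the first value is always >= the second
```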

Is there perhaps a reverse version of this inequality

$$H_{\max}(A)_\rho \geq H_0(A)_\rho + \text{something}$$

which would justify using the same name for both quantities?
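For intuition (an illustrative example, not part of the original post): take $\rho_A = \operatorname{diag}(1-\epsilon, \epsilon)$ with small $\epsilon > 0$. Then

$$H_0(A)_\rho = \log 2, \qquad H_{\max}(A)_\rho = 2\log\left(\sqrt{1-\epsilon} + \sqrt{\epsilon}\right) \xrightarrow{\epsilon \to 0} 0,$$

so the gap between the two quantities can be as large as $H_0$ itself, and any reverse inequality of the above form would need a state-dependent correction term.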

One Answer

> The term max-entropy in quantum information is reserved for the following definition

No, it's not; many papers, such as https://arxiv.org/abs/0803.2770, use the term to refer to the quantity $\log \operatorname{rank}(\rho)$. Your first definition comes from the Rényi entropy of order $0$, while the second comes from the Rényi entropy of order $\frac{1}{2}$, and you should always check which one the authors are referring to.
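To spell out the connection the answer points to (a standard computation, not part of the original answer): the Rényi entropy of order $\alpha$ is

$$H_\alpha(\rho) = \frac{1}{1-\alpha}\log\operatorname{tr}\left(\rho^\alpha\right).$$

Taking $\alpha \to 0$ gives $\operatorname{tr}(\rho^\alpha) = \sum_{\lambda_i > 0} \lambda_i^\alpha \to \operatorname{rank}(\rho)$, hence $H_0(\rho) = \log\operatorname{rank}(\rho)$; taking $\alpha = \tfrac{1}{2}$ gives the prefactor $\frac{1}{1-1/2} = 2$ and hence $H_{1/2}(\rho) = 2\log\operatorname{tr}(\rho^{1/2})$, matching the two definitions above.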

Correct answer by user13507 on April 6, 2021
