52 Things: Number 9: What are Shannon's definitions of entropy and information?

 
This is the latest in a series of blog posts to address the list of '52 Things Every PhD Student Should Know To Do Cryptography': a set of questions compiled to give PhD candidates a sense of what they should know by the end of their first year. This blog post tries to introduce some fundamental concepts in Information Theory: What are Shannon's definitions of entropy and information?
 
 
Information Theory was founded by Claude E. Shannon in 1948. It was originally developed to study signal processing, but over the decades its applications have broadened to many other disciplines. This blog post briefly introduces two fundamental concepts, entropy and information. If you are interested in more details, I suggest looking at [1].
 
 
Entropy 
 
Entropy is a measure of the uncertainty [3] of one or more variables.
 
Assume we are investigating the first web page people visit when they open a web browser. We use samples from two groups of people: 4 cryptographers from Bristol Cryptogroup and 4 passengers stopped at Bristol coach station. Let's make the rather bold assumption that the cryptographers always visit http://bristolcrypto.blogspot.co.uk/ first when they open a web browser, while each passenger visits a different portal.
 
Now let's evaluate the uncertainty of their answers: apparently, the answers from the cryptographers are quite certain (low uncertainty), whilst a passenger's answer can hardly be guessed (high uncertainty). In other words, we say the answers in the cryptographer group have low entropy, while those in the passenger group have high entropy.
 
So one of Shannon's most remarkable inventions is his definition of entropy (Shannon entropy):

$$H = -\sum_{i} p_i \log_b p_i$$


where $p_i$ is the probability that the $i$-th message (an answer in the previous example) appears. In computer science, we usually use $b = 2$ (bits).

If we compute the entropy in our example, we will have:


$$H_{\text{cryptographer}} = -\sum_{i=1}^{4} 1 \cdot \log_2 1 = 0$$
$$H_{\text{passenger}} = -\sum_{i=1}^{4} \tfrac{1}{4} \log_2 \tfrac{1}{4} = 2$$

 
So the passengers' answers do have a higher entropy than the cryptographers'!
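To make this concrete, here is a minimal Python sketch (the helper name `shannon_entropy` is our own, not from the original post) that reproduces both numbers:

```python
from math import log

def shannon_entropy(probabilities, b=2):
    """Shannon entropy H = -sum_i p_i * log_b(p_i), skipping zero-probability terms."""
    return sum(-p * log(p, b) for p in probabilities if p > 0)

# All 4 cryptographers give the same answer: a single message with probability 1.
print(shannon_entropy([1.0]))                     # 0.0 bits

# Each of the 4 passengers names a different portal: 4 messages, each with probability 1/4.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
```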
 
Information
 
Formally, the definition of Shannon's information is given in [2] as:
 
    " Information is a measure of one's freedom of choice when one selects a message."
“信息是衡量一个人在选择信息时的选择自由度。”
 
To explain this, let's make a minor modification to our previous example. Let's grab another 4 passengers from Bristol train station and assume their answers are also random portals, just like those of the passengers at the coach station.
 
Here is the question: given an answer y, can you tell which group the answer came from?
 
We can instantly tell that the answer is from the cryptographer group if y is http://bristolcrypto.blogspot.co.uk/. However, we will struggle if y is a portal; therefore, we could say the message http://bristolcrypto.blogspot.co.uk/ contains more information (less freedom) than a portal (more freedom).
 
So how does it relate to entropy?
 
Extending the definition of entropy, we define the Conditional Entropy as:
 
$$H(Y|X) = \sum_{x \in X} p(x)\, H(Y|X=x)$$
 
which describes the entropy of Y given the condition X = x. To put it more explicitly: since entropy is the uncertainty of a variable, the conditional entropy is in fact the uncertainty of Y given the "clue" (condition) X.
 
Observation: consider two variables X and Y. If X contains only minimal information about Y, then knowing the exact value of X should not help us much in deducing the value of Y; that is, it does not obviously reduce the uncertainty of Y. On the other hand, if X contains essential information about Y, then the entropy of Y is expected to be low when X is given. Therefore, conditional entropy can be viewed as a reasonable measure of the information X carries about Y!
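As an illustration, here is a sketch (under our own modelling assumptions, reusing the `shannon_entropy` helper from above) that computes H(Y|X) for the twelve people in our example, where X is the group a person belongs to and Y is their answer:

```python
# H(Y|X) = sum_x p(x) * H(Y|X=x), with X = group and Y = first page visited.
answers_per_group = {
    "cryptographers":   [1.0],                     # everyone gives the same answer
    "coach passengers": [0.25, 0.25, 0.25, 0.25],  # four distinct portals
    "train passengers": [0.25, 0.25, 0.25, 0.25],  # four distinct portals
}
p_x = 1 / len(answers_per_group)  # each group has 4 of the 12 people, so p(x) = 1/3
h_y_given_x = sum(p_x * shannon_entropy(dist) for dist in answers_per_group.values())
print(h_y_given_x)  # ~1.33 bits: the "clue" X removes part of Y's uncertainty
```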
 
Another important measure is called Mutual Information. It measures the mutual dependency between two variables. One way to define it is as the loss of entropy (uncertainty) when a condition is given:

$$I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)$$
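Continuing our sketch (again with our own toy modelling, and assuming the train passengers visit the same four portals as the coach passengers), the mutual information between the group X and the answer Y can be estimated from the joint sample, using the chain rule H(X|Y) = H(X,Y) - H(Y):

```python
from collections import Counter
from math import log2

def empirical_entropy(outcomes):
    """Empirical Shannon entropy (base 2) of a list of observed outcomes."""
    counts = Counter(outcomes)
    n = len(outcomes)
    return -sum(c / n * log2(c / n) for c in counts.values())

# (group, answer) for all 12 people; train passengers reuse the coach portals.
sample = ([("crypto", "bristolcrypto")] * 4
          + [("coach", f"portal{i}") for i in range(4)]
          + [("train", f"portal{i}") for i in range(4)])

h_x = empirical_entropy([x for x, _ in sample])  # H(X) = log2(3), about 1.58 bits
h_y = empirical_entropy([y for _, y in sample])  # H(Y), about 2.25 bits
h_xy = empirical_entropy(sample)                 # joint entropy H(X,Y)
h_x_given_y = h_xy - h_y                         # chain rule
print(h_x - h_x_given_y)  # I(X;Y), about 0.92 bits shared by group and answer
```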
 
 
 
Cryptographic Example
The concepts of information theory are widely used in cryptography. A classic example is to view a cryptographic process as a channel, with the plaintext as input and the ciphertext as output. Research in side-channel analysis also benefits from information theory.
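As a toy illustration of this channel view (our own sketch, not from the original post): for a one-time pad with a uniformly random key, the mutual information between plaintext and ciphertext is zero, which is exactly Shannon's notion of perfect secrecy. Reusing `empirical_entropy` from above and the equivalent identity I(P;C) = H(P) + H(C) - H(P,C):

```python
from itertools import product

# Toy one-time pad over 2-bit messages: C = P XOR K, with P and K uniform.
pairs = [(p, p ^ k) for p, k in product(range(4), repeat=2)]  # all (P, C) pairs

h_p = empirical_entropy([p for p, _ in pairs])  # H(P) = 2 bits
h_c = empirical_entropy([c for _, c in pairs])  # H(C) = 2 bits
h_pc = empirical_entropy(pairs)                 # H(P,C) = 4 bits
print(h_p + h_c - h_pc)  # 0.0: the ciphertext leaks nothing about the plaintext
```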