Elements of Information Theory: Exercises
2.2 Entropy of functions.
Let $X$ be a random variable taking on a finite number of values. What is the (general) inequality relationship of $H(X)$ and $H(Y)$ if
(a) $Y = 2^X$?
(b) $Y = \cos X$?
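For both parts, the key fact (a standard argument, added here as a sketch) is that a function of $X$ cannot have more entropy than $X$ itself. Expanding $H(X, g(X))$ two ways,

$$H(X, g(X)) = H(X) + H(g(X) \mid X) = H(X), \qquad H(X, g(X)) = H(g(X)) + H(X \mid g(X)) \ge H(g(X)),$$

so $H(g(X)) \le H(X)$, with equality if and only if $g$ is one-to-one on the support of $X$. For (a), $x \mapsto 2^x$ is one-to-one, so $H(Y) = H(X)$. For (b), $\cos$ can merge distinct values (e.g. $\cos(-x) = \cos x$), so in general $H(Y) \le H(X)$.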
2.6 Conditional mutual information vs. unconditional mutual information.
Give examples of joint random variables $X$, $Y$, and $Z$ such that
(a) $I(X; Y \mid Z) < I(X; Y)$,
(b) $I(X; Y \mid Z) > I(X; Y)$.
(a) Let $X = Y = Z$, where $X \sim \mathrm{Bernoulli}(1/2)$. Then $I(X; Y) = H(X) = 1$ bit, while $I(X; Y \mid Z) = 0$, so $I(X; Y \mid Z) < I(X; Y)$.

Something counterintuitive here is how to interpret $I(X; Y \mid Z) = 0$ when $X$ and $Y$ are identical.

Interpretation by ChatGPT:

In the case where $X = Y = Z$, the conditional mutual information is zero because the two random variables $X$ and $Y$ are completely determined by the value of the random variable $Z$. In other words, knowing the value of $Z$ gives us complete information about both $X$ and $Y$, so there is no additional information to be gained by considering $X$ and $Y$ together. This means that the two random variables are independent given the value of $Z$, so $I(X; Y \mid Z) = 0$.
(b) Let $X$ and $Y$ be independent fair coin flips, and let $Z = X \oplus Y$ (addition mod 2). Then $I(X; Y) = 0$, but given $Z$, $X$ completely determines $Y$, so $I(X; Y \mid Z) = H(X \mid Z) - H(X \mid Y, Z) = 1 - 0 = 1$ bit $> I(X; Y)$.
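As a numerical check of example (b), here is a small sketch I added: it enumerates the joint pmf of $(X, Y, Z)$ with $Z = X \oplus Y$ and computes both mutual informations from entropies.

h[p_] := -Total[# Log[2, #] & /@ Select[p, # > 0 &]];
atoms = Flatten[Table[{x, y, Mod[x + y, 2], 1/4}, {x, 0, 1}, {y, 0, 1}], 1];
marg[idx_] := Values[GroupBy[atoms, #[[idx]] &, Total[#[[All, 4]]] &]];
{h[marg[{1}]] + h[marg[{2}]] - h[marg[{1, 2}]],                          (* I(X;Y) = H(X)+H(Y)-H(X,Y) *)
 h[marg[{1, 3}]] + h[marg[{2, 3}]] - h[marg[{3}]] - h[atoms[[All, 4]]]}  (* I(X;Y|Z) *)
(* -> {0, 1} *)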
2.28 Mixing increases entropy.
Show that the entropy of the probability distribution $(p_1, \dots, p_i, \dots, p_j, \dots, p_m)$ is less than the entropy of the distribution $(p_1, \dots, \frac{p_i + p_j}{2}, \dots, \frac{p_i + p_j}{2}, \dots, p_m)$. Show that, in general, any transfer of probability that makes the distribution more uniform increases the entropy.
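One standard route (a sketch I'm adding, not necessarily the intended proof): entropy is concave and invariant under permutations of its arguments. Let $\mathbf{p}$ be the original distribution and $\mathbf{p}^\sigma$ the same distribution with $p_i$ and $p_j$ swapped. The mixed distribution is exactly $\frac{1}{2}\mathbf{p} + \frac{1}{2}\mathbf{p}^\sigma$, so

$$H\!\left(\tfrac{1}{2}\mathbf{p} + \tfrac{1}{2}\mathbf{p}^\sigma\right) \ge \tfrac{1}{2} H(\mathbf{p}) + \tfrac{1}{2} H(\mathbf{p}^\sigma) = H(\mathbf{p}).$$

The same argument with a mixing weight $\lambda \in (0, 1)$ covers any transfer of probability from a larger $p_i$ toward a smaller $p_j$.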
2.30 Maximum entropy.
Find the probability mass function $p(x)$ that maximizes the entropy $H(X)$ of a nonnegative integer-valued random variable $X$ subject to the constraint

$$EX = \sum_{n=0}^{\infty} n\, p(n) = A$$

for a fixed value $A > 0$. Evaluate this maximum $H(X)$.
Sol. We tackle this problem by using Lagrange multipliers. We maximize

$$J(p) = -\sum_{n=0}^{\infty} p(n) \log p(n) + \lambda_1 \sum_{n=0}^{\infty} p(n) + \lambda_2 \sum_{n=0}^{\infty} n\, p(n),$$

where the multipliers $\lambda_1$ and $\lambda_2$ enforce the constraints $\sum_n p(n) = 1$ and $\sum_n n\, p(n) = A$. Setting the derivative to zero gives $\partial J / \partial p(n) = -\log p(n) - 1 + \lambda_1 + \lambda_2 n = 0$, which becomes

$$p(n) = e^{\lambda_1 - 1}\left(e^{\lambda_2}\right)^n.$$

We can rewrite this as a geometric distribution; to simplify the notation, let $\alpha = e^{\lambda_1 - 1}$ and $\beta = e^{\lambda_2}$, so that $p(n) = \alpha \beta^n$, so the rest of the restrictions become

$$\sum_{n=0}^{\infty} \alpha \beta^n = \frac{\alpha}{1 - \beta} = 1, \qquad \sum_{n=0}^{\infty} n\, \alpha \beta^n = \frac{\alpha \beta}{(1 - \beta)^2} = A,$$

which implies that

$$\alpha = \frac{1}{A + 1}, \qquad \beta = \frac{A}{A + 1}.$$

So the entropy-maximizing distribution is

$$p(n) = \frac{1}{A + 1}\left(\frac{A}{A + 1}\right)^n, \qquad n = 0, 1, 2, \dots$$

Plugging these values into the expression for the maximum entropy,

$$H_{\max} = -\sum_{n=0}^{\infty} p(n) \log p(n) = -\log \alpha - A \log \beta = (A + 1)\log(A + 1) - A \log A.$$
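A quick numeric check I added: for $A = 1$ the maximizer should be $p(n) = (1/2)^{n+1}$ with entropy $(A+1)\log_2(A+1) - A \log_2 A = 2$ bits. The sums are truncated at $n = 200$, which is far past where the terms matter.

a = 1;
pmax[n_] := 1/(a + 1) (a/(a + 1))^n;
N[{Sum[pmax[n], {n, 0, 200}],                     (* total probability -> 1 *)
   Sum[n pmax[n], {n, 0, 200}],                   (* mean -> 1 = A *)
   Sum[-pmax[n] Log[2, pmax[n]], {n, 0, 200}]}]   (* entropy -> 2 bits *)
(* -> {1., 1., 2.} *)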
2.33 Fano's inequality.
Let $\Pr\{X = i\} = p_i$, $i = 1, 2, \dots, m$, and let $p_1 \ge p_2 \ge \cdots \ge p_m$. The minimal probability of error predictor of $X$ is $\hat{X} = 1$, with resulting probability of error $P_e = 1 - p_1$. Maximize $H(\mathbf{p})$ subject to the constraint $1 - p_1 = P_e$ to find a bound on $P_e$ in terms of $H$.
Solution (Thomas M. Cover, Joy A. Thomas): (Fano's Inequality.) The minimal probability of error predictor when there is no information is $\hat{X} = 1$, the most probable value of $X$, with resulting probability of error $P_e = 1 - p_1$. Hence if we fix $P_e$, we fix $p_1$. We maximize the entropy of $X$ for a given $P_e$ to obtain an upper bound on the entropy:

$$\begin{aligned} H(\mathbf{p}) &= -p_1 \log p_1 - \sum_{i=2}^{m} p_i \log p_i \\ &= -p_1 \log p_1 - P_e \sum_{i=2}^{m} \frac{p_i}{P_e} \log \frac{p_i}{P_e} - P_e \log P_e \\ &= H(P_e) + P_e\, H\!\left(\frac{p_2}{P_e}, \frac{p_3}{P_e}, \dots, \frac{p_m}{P_e}\right) \\ &\le H(P_e) + P_e \log(m - 1), \end{aligned}$$

since the maximum of $H\!\left(\frac{p_2}{P_e}, \dots, \frac{p_m}{P_e}\right)$ is $\log(m-1)$, attained by a uniform distribution. Hence any $X$ that can be predicted with a probability of error $P_e$ must satisfy

$$H(X) \le H(P_e) + P_e \log(m - 1),$$

which is the unconditional form of Fano's inequality. We can weaken this inequality to obtain an explicit lower bound for $P_e$:

$$P_e \ge \frac{H(X) - 1}{\log(m - 1)}.$$
Solution: (is this correct?) Starting from the unconditional form above,

$$H(X) \le H(P_e) + P_e \log(m - 1),$$

or

$$P_e \ge \frac{H(X) - H(P_e)}{\log(m - 1)},$$

where $H(P_e) \le 1$ bit, then

$$P_e \ge \frac{H(X) - 1}{\log(m - 1)}.$$
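As a concrete check (my addition): for $X$ uniform on $m = 4$ values, $H(X) = 2$ bits, and the weakened bound gives $P_e \ge (2 - 1)/\log_2 3 \approx 0.63$, consistent with the true minimal probability of error $P_e = 3/4$.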
2.46 Axiomatic definition of entropy (Difficult).
If we assume certain axioms for our measure of information, we will be forced to use a logarithmic measure such as entropy. Shannon used this to justify his initial definition of entropy. In this book we rely more on the other properties of entropy rather than its axiomatic derivation to justify its use. The following problem is considerably more difficult than the other problems in this section.
If a sequence of symmetric functions $H_m(p_1, p_2, \dots, p_m)$ satisfies the following properties:

- Normalization: $H_2\!\left(\frac{1}{2}, \frac{1}{2}\right) = 1$,
- Continuity: $H_2(p, 1 - p)$ is a continuous function of $p$,
- Grouping: $H_m(p_1, p_2, \dots, p_m) = H_{m-1}(p_1 + p_2, p_3, \dots, p_m) + (p_1 + p_2)\, H_2\!\left(\frac{p_1}{p_1 + p_2}, \frac{p_2}{p_1 + p_2}\right)$,

prove that $H_m$ must be of the form

$$H_m(p_1, p_2, \dots, p_m) = -\sum_{i=1}^{m} p_i \log p_i, \qquad m = 2, 3, \dots$$
There are various other axiomatic formulations which result in the same definition of entropy. See, for example, the book by Csiszár and Körner [149].
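As a small illustration of the grouping axiom (my addition): since $H_3$ is symmetric, we may group the two equal masses of $\left(\frac{1}{2}, \frac{1}{4}, \frac{1}{4}\right)$, giving

$$H_3\!\left(\tfrac{1}{2}, \tfrac{1}{4}, \tfrac{1}{4}\right) = H_2\!\left(\tfrac{1}{2}, \tfrac{1}{2}\right) + \tfrac{1}{2} H_2\!\left(\tfrac{1}{2}, \tfrac{1}{2}\right) = \tfrac{3}{2},$$

which matches $-\sum_i p_i \log_2 p_i = \frac{1}{2} + \frac{1}{2} + \frac{1}{2} = \frac{3}{2}$ bits, as the claimed form requires.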
3.13 Calculation of typical set.
To clarify the notion of a typical set $A_\epsilon^{(n)}$ and the smallest set of high probability $B_\delta^{(n)}$, we will calculate the set for a simple example. Consider a sequence of i.i.d. binary random variables $X_1, X_2, \dots, X_n$, where the probability that $X_i = 1$ is 0.6 (and therefore the probability that $X_i = 0$ is 0.4).
(a) Calculate $H(X)$.
(b) With $n = 25$ and $\epsilon = 0.1$, which sequences fall in the typical set $A_\epsilon^{(n)}$? What is the probability of the typical set? How many elements are there in the typical set? (This involves computing a table of probabilities for sequences with $k$ 1's, $0 \le k \le 25$, and finding those sequences that are in the typical set.)
(c) How many elements are there in the smallest set that has probability
(d) How many elements are there in the intersection of the sets in parts (b) and (c)? What is the probability of this intersection?
The table in the book seems to be problematic, so here is my version.
The answer is not finished and may also be problematic.
Clear[n, k, p]
n = 25;
p = 0.6;
(* columns: k, number of sequences with k 1's, probability of that class,
   and -(1/n) Log2 of the probability p^k (1-p)^(n-k) of one such sequence *)
Table[{k, Binomial[n, k],
   Binomial[n, k] p^k (1 - p)^(n - k),
   -(1/n) Log[2, p^k (1 - p)^(n - k)]}, {k, 0, n}] // MatrixForm
Sol.
(a)
p = {0.4, 0.6};            (* {P(X = 0), P(X = 1)} *)
hx = Total[-p Log[2, p]]   (* H(X) in bits *)
n = 25;
e = 0.1;
0.970951
(b)
hx + e
hx - e
1.07095
0.870951
Choose the values of $k$ for which $-\frac{1}{n}\log_2 p(x^{(n)})$ falls in the interval $(hx - e, hx + e) = (0.870951, 1.07095)$; the sequences with those counts of 1's form the typical set $A_\epsilon^{(n)}$, as in the sketch below.
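A minimal sketch I added for this selection; it reuses p, n, e, and hx from part (a), and uses the per-sequence probability $p(x^{(n)}) = 0.6^k\, 0.4^{n-k}$.

perSeq[k_] := -(1/n) Log[2, p[[2]]^k p[[1]]^(n - k)];
typicalK = Select[Range[0, n], hx - e < perSeq[#] < hx + e &]
(* -> {11, 12, ..., 19} *)
Total[Binomial[n, #] p[[2]]^# p[[1]]^(n - #) & /@ typicalK]   (* probability of the typical set *)
Total[Binomial[n, #] & /@ typicalK]                           (* number of elements in it *)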
(c)
Clear[pr, n, k, p]
p = 0.6;
n = 25;
(* {k, probability of the class of sequences with exactly k 1's} *)
pr = Table[{k, Binomial[n, k] p^k (1 - p)^(n - k)}, {k, 0, n}];
Total[pr[[All, 2]]]                        (* sanity check: total probability is 1 *)
pr = Reverse[SortBy[pr, Last]];            (* most probable classes first *)
pr[[All, 2]] = Accumulate[pr[[All, 2]]];   (* cumulative probability *)
pr // MatrixForm
We can use a greedy approach to choose items: take classes of sequences in decreasing order of probability, and stop once the total probability reaches 0.9. The matrix above shows the cumulative probability after each class is added; the sketch below turns this into a count of elements.
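A sketch I added to count the elements of the greedy set. Note it takes whole classes, so it slightly overshoots probability 0.9; the true smallest set could instead take only part of the boundary class, since all sequences within a class are equiprobable.

cut = LengthWhile[pr[[All, 2]], # < 0.9 &] + 1;   (* number of classes needed to pass 0.9 *)
Total[Binomial[n, #] & /@ pr[[;; cut, 1]]]        (* number of sequences in the greedy set *)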
(d) Omitted.
5.28 Shannon code.
(p. 176)
Consider the following method for generating a code for a random variable $X$ that takes on $m$ values $\{1, 2, \dots, m\}$ with probabilities $p_1, p_2, \dots, p_m$. Assume that the probabilities are ordered so that $p_1 \ge p_2 \ge \cdots \ge p_m$. Define

$$F_i = \sum_{k=1}^{i-1} p_k,$$

the sum of the probabilities of all symbols less than $i$. Then the codeword for $i$ is the number $F_i \in [0, 1]$ rounded off to $l_i$ bits, where $l_i = \lceil \log \frac{1}{p_i} \rceil$.

(a) Show that the code constructed by this process is prefix-free and that the average length satisfies

$$H(X) \le L < H(X) + 1.$$

(b) Construct the code for the probability distribution $(0.5, 0.25, 0.125, 0.125)$.
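A sketch of the construction applied to part (b), following the method stated above (my addition; extracting the leading $l_i$ binary digits of $F_i$ via Floor is an implementation choice).

p = {1/2, 1/4, 1/8, 1/8};             (* exact form of (0.5, 0.25, 0.125, 0.125), already ordered *)
f = Prepend[Accumulate[Most[p]], 0];  (* F_i = p_1 + ... + p_(i-1) *)
l = Ceiling[Log[2, 1/p]];             (* l_i = ceil(log2 1/p_i) *)
codes = MapThread[
  StringJoin[ToString /@ IntegerDigits[Floor[#1 2^#2], 2, #2]] &, {f, l}]
(* -> {"0", "10", "110", "111"} *)

Since this distribution is dyadic, $l_i = \log_2 \frac{1}{p_i}$ exactly, and the average length $L = \sum_i p_i l_i = 1.75$ bits equals $H(X)$.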
7.4 Channel capacity.
Consider the discrete memoryless channel $Y = X + Z \pmod{11}$, where

$$Z = \begin{cases} 1, & \text{with probability } 1/3 \\ 2, & \text{with probability } 1/3 \\ 3, & \text{with probability } 1/3 \end{cases}$$

and $X \in \{0, 1, \dots, 10\}$. Assume that $Z$ is independent of $X$.

(a) Find the capacity.
Solution:
Since $Z$ is independent of $X$, $H(Y \mid X) = H(X + Z \mid X) = H(Z) = \log 3$, so that $I(X; Y) = H(Y) - H(Y \mid X) = H(Y) - \log 3 \le \log 11 - \log 3$;
the channel is a symmetric channel, and if $X$ is uniform over $\{0, 1, \dots, 10\}$, then $Y$ is uniform as well, so $H(Y) = \log 11$ is achieved.
Hence the channel capacity is
$$C = \max_{p(x)} I(X; Y) = \log 11 - \log 3,$$
attained by the uniform input distribution.
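For reference, the numeric value in bits (my addition):

N[Log[2, 11] - Log[2, 3]]   (* capacity in bits *)
(* -> 1.87447 *)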
Reference
- Cover, T. M., and Thomas, J. A. Elements of Information Theory, 2nd ed. Hoboken, NJ: Wiley-Interscience, 2006.
This post is from 博客园 (cnblogs), by K1øN. When reposting, please cite the original link: https://www.cnblogs.com/kion/p/16988361.html