ENEE 627: Information Theory, Spring 2018
Instructor: Alexander Barg, Professor
Department of Electrical and Computer Engineering / Institute for Systems Research
Office: 2361 A.V. Williams Building. E-mail: abarg@umd.edu
TA: Zitan Chen, E-mail: chenztan@gmail.com
TA office hours: Monday 2:30-4:00, AVW 2456
Class times: Tuesday and Thursday 3:30-4:45 pm, AJC 2121
Instructor availability outside class hours: after class (preferred), or by email
Course Homepage: http://www.ece.umd.edu/~abarg/627
Grading: several (about 5) home assignments (20%), Midterm 1 (25%), Midterm 2 (25%), final exam (30%).
Textbook: T. M. Cover and J. A. Thomas, "Elements of Information Theory," 2nd ed., Wiley-Interscience, 2006
Lecture Notes in Statistics by John Duchi (Stanford)
Lecture Notes on Information Theory by Yury Polyanskiy (MIT) and Yihong Wu (Yale)
Other useful books (recommended, will not be used in an essential way):
I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, 2nd ed., Cambridge University Press, 2011
R. Gallager, Information Theory and Reliable Communication, Wiley, 1968
Documentary: "Claude Shannon: Father of the Information Age"
Prerequisites: Probability and random processes.
Home assignments:
Problem Set 1, due in class on 2/15/18. Solutions
Problem Set 2, due on 2/27. Solutions
Problem Set 3, due on 4/3. Solutions
Additional problems 1
Problem Set 4, due on 4/17. Solutions
Problem Set 5, due on 4/24. Solutions
Problem Set 6, due on 5/3. Solutions
Problem Set 7 (not graded). Solutions
The following schedule gives the tentative contents of the course; it will be adjusted during the semester, and the distribution of topics among the lectures is approximate. The final exam is based on this list of topics.
Lecture 1. (1/25) Introduction to the course. Review of probability theory. The notions of entropy and mutual information. (CT: Introduction)
Lecture 2. (1/30) Entropy, joint and conditional entropy, divergence. (CT: 2.1-2.6)
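For reference, the basic quantities from Lectures 1-2 (all logarithms base 2, as in CT Ch. 2):
H(X) = -\sum_x p(x)\log p(x),   H(X|Y) = -\sum_{x,y} p(x,y)\log p(x|y),
D(p\|q) = \sum_x p(x)\log(p(x)/q(x)),   I(X;Y) = H(X) - H(X|Y) = D(p(x,y)\|p(x)p(y)).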
Lecture 3. (2/1) Convexity and inequalities (CT: 2.6, 2.7)
Lecture 4. (2/6) Convexity of I(X;Y), data processing inequality, Fano's inequality (CT: 2.7, 2.8, 2.10)
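Fano's inequality, in the form used repeatedly later in the course: if \hat{X} = g(Y) is any estimate of X and P_e = \Pr\{\hat{X} \ne X\}, then
H(X|Y) \le h(P_e) + P_e \log(|\mathcal{X}| - 1),
where h(\cdot) is the binary entropy function.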
Lecture 5. (2/8) Asymptotic Equipartition (CT: 3.1-3.3)
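The AEP in one line: for X_1, X_2, \ldots i.i.d. with distribution p(x),
-(1/n)\log p(X_1,\ldots,X_n) \to H(X) in probability,
so the typical set A_\epsilon^{(n)} carries almost all of the probability and has size roughly 2^{nH(X)}.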
Lecture 6. (2/13) Types: a refinement of typical sequences (CT: 11.1, 11.2)
Lecture 7. (2/15) Types: a refinement of typical sequences, continued (CT: 11.1, 11.2)
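The counting facts that drive the method of types: the number of types of sequences in \mathcal{X}^n is at most (n+1)^{|\mathcal{X}|}, each type class satisfies
(n+1)^{-|\mathcal{X}|} 2^{nH(P)} \le |T(P)| \le 2^{nH(P)},
and under an i.i.d. source Q the type class T(P) has probability about 2^{-nD(P\|Q)} (CT Sect. 11.1).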
Lecture 8. (2/20) Data compression, unique decodability, Kraft's inequality (CT: 5.1, 5.2)
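Kraft's inequality: a prefix code over a D-ary alphabet with codeword lengths l_1, \ldots, l_m exists if and only if
\sum_{i=1}^m D^{-l_i} \le 1.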
Lecture 9. (2/22) Optimal prefix codes (CT: 5.3, 5.4). Block codes. McMillan's theorem (CT: 5.5)
Lecture 10. (2/27) Huffman codes, optimality.
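As a supplement to Lecture 10, here is a minimal Python sketch of Huffman's algorithm (binary case; an illustration rather than course code, and the function name is mine):

import heapq

def huffman_code(probs):
    """Build a binary Huffman code for a dict {symbol: probability}.
    Returns {symbol: codeword string}. Repeatedly merges the two
    least-probable subtrees, as in CT Sect. 5.6."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                      # degenerate single-symbol source
        return {s: "0" for s in probs}
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)     # two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

# Example: a dyadic source, where codeword lengths equal -log2 p exactly
print(huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))

The expected length L of the resulting code satisfies H(X) \le L < H(X) + 1, and Huffman codes minimize the expected length among prefix codes (CT Sects. 5.4, 5.8).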
Lecture 11. (3/1) Shannon-Fano-Elias codes, optimality (CT: 5.6-5.10). Universal coding (CT: Ch. 13). Arithmetic coding (CT: 13.3, 13.2)
Lecture 12. (3/6) Universal coding. LZ78 and its optimality.
Lecture 13. (3/8) LZ78 continued
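An illustrative Python sketch of the LZ78 parsing discussed in Lectures 12-13 (my own sketch, not from the course notes):

def lz78_parse(s):
    """Parse string s into LZ78 phrases.
    Each output pair is (index of longest previously seen phrase, next char);
    index 0 denotes the empty phrase."""
    dictionary = {"": 0}           # phrase -> index
    phrases, current = [], ""
    for ch in s:
        if current + ch in dictionary:
            current += ch          # extend the current phrase
        else:
            phrases.append((dictionary[current], ch))
            dictionary[current + ch] = len(dictionary)
            current = ""
    if current:                    # flush a final, already-seen phrase
        phrases.append((dictionary[current[:-1]], current[-1]))
    return phrases

# Parses into the phrases 1, 0, 11, 01, 010, 00, 10
print(lz78_parse("1011010100010"))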
Lecture 14. (3/27) Channel capacity. Definition and examples (CT: 7.1 + notes)
Lecture 15. (3/29) Proof of the BSC capacity bound. The scaling problem. (notes)
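The benchmark result behind Lectures 14-15: the BSC with crossover probability p has capacity
C = 1 - h(p) bits per channel use,
where h is the binary entropy function; e.g., p = 0.11 gives C \approx 0.5.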
Lecture 16. (4/3) Jointly typical sequences. Direct part of the capacity theorem for the DMC (CT: 7.4-7.7)
Lecture 17. (4/5) Converse for a DMC (CT: 7.9)
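The object of Lectures 16-17 is the capacity C = \max_{p(x)} I(X;Y) of a DMC. A small self-contained Python sketch (names are mine, not course code) that evaluates I(X;Y) for a given channel matrix, useful for checking examples numerically:

import numpy as np

def mutual_information(p_x, W):
    """I(X;Y) in bits for input distribution p_x and channel matrix W,
    where W[x][y] = p(y|x)."""
    p_x = np.asarray(p_x, dtype=float)
    W = np.asarray(W, dtype=float)
    p_xy = p_x[:, None] * W          # joint distribution p(x,y)
    p_y = p_xy.sum(axis=0)           # output marginal
    mask = p_xy > 0
    prod = p_x[:, None] * p_y[None, :]
    return float((p_xy[mask] * np.log2(p_xy[mask] / prod[mask])).sum())

# BSC with crossover 0.11: the uniform input achieves capacity 1 - h(0.11)
W = [[0.89, 0.11], [0.11, 0.89]]
print(mutual_information([0.5, 0.5], W))   # ~0.5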
Lecture 18. (4/6) (makeup class) Symmetric channels. Source-channel separation (CT: 7.13). Review session for Midterm 2
Lecture 19. (4/17) Effective version of Shannon's theorem: polar codes (notes); watch YouTube lectures 1 and 2
Lecture 20. (4/19) Polar codes on the BEC, completed
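For the BEC the polar transform of Lectures 19-20 acts explicitly on erasure probabilities: a BEC(z) splits into a worse channel BEC(2z - z^2) and a better one BEC(z^2). A short illustrative Python sketch (mine, not from the notes) traces the recursion:

def polarize_bec(z, levels):
    """Erasure probabilities of the 2**levels synthetic channels
    obtained from BEC(z) by repeated polar transforms."""
    zs = [z]
    for _ in range(levels):
        zs = [w for zi in zs for w in (2*zi - zi*zi, zi*zi)]
    return zs

# Most synthetic channels drift toward erasure probability 0 or 1; the
# fraction of near-perfect channels approaches capacity 1 - z as levels grow.
zs = polarize_bec(0.5, 10)
print(sum(1 for w in zs if w < 0.01) / len(zs))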
Lecture 21. (4/20) (makeup class) Continuous RVs. Differential entropy (CT: 8.1, 8.2). The Gaussian RV has the largest h(X) (CT: 8.5-8.6).
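The maximum-entropy fact from Lecture 21: among all densities with variance \sigma^2, the Gaussian has the largest differential entropy,
h(X) = (1/2)\log(2\pi e \sigma^2).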
Lecture 22. (4/24) Capacity of the discrete-time Gaussian channel (CT 9.1-9.4)
Lecture 23. (4/26) Gaussian channel: Converse coding theorem; Band-limited AWGN channel; parallel Gaussian channels (CT 9.2-9.4)
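The headline formulas of Lectures 22-23: the discrete-time Gaussian channel with power constraint P and noise variance N has
C = (1/2)\log(1 + P/N) bits per transmission,
the band-limited AWGN channel gives C = W\log(1 + P/(N_0 W)), and for parallel Gaussian channels the optimal power allocation is water-filling: P_i = (\nu - N_i)^+ with \sum_i P_i = P.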
Lecture 24. (5/1) Distances between probability distributions (Duchi). Pinsker's inequality (Duchi 2.2; CT 11.6). Hypothesis testing and the Neyman-Pearson lemma (CT 11.7).
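Pinsker's inequality bounds total variation by divergence; in one common normalization,
\|P - Q\|_{TV} \le \sqrt{D(P\|Q)/2} (divergence in nats),
with the constant depending on the convention, so match the one used in the lecture notes (Duchi 2.2; CT 11.6).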
Lecture 25. (5/3) Estimation of the mean. Minimax risk. Fisher information and the Cramér-Rao bound. The Fano method (example). (Duchi Ch. 13)
Lecture 26. (5/8) From estimation to testing. Le Cam's lemma. (Duchi 13.1-13.2)
Lecture 27. (5/10) Le Cam's lemmas; the Fano method (Duchi Ch. 13); large deviations (Stein's lemma; CT 11.7)
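Stein's lemma, in the form closing the course: when testing P against Q with the type-I error held below a fixed \alpha, the best achievable type-II error satisfies
\beta_n \doteq 2^{-n D(P\|Q)},
so the divergence is exactly the large-deviations exponent of hypothesis testing.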
Main topics:
• Entropy, relative entropy, mutual information (Ch. 2)
• Typical sets and the asymptotic equipartition property (AEP) (Ch. 3; Ch. 11, Sects. 11.1, 11.2)
• Data compression, source codes (Ch. 5)
• Capacity of discrete memoryless channels, direct and converse coding theorems (Ch. 7 + notes)
• Differential entropy (Ch. 8); the Gaussian channel and its capacity (Ch. 9)
• Information theory and statistics (Ch. 11)