Lecture 1: Introduction, Entropy, Cross Entropy and Mutual Information
Lecture 2: Interpretation of information measures as expectations, some important inequalities.
Lecture 3: More inequalities. Asymptotic Equipartition Property
Lecture 4: Typical sequences. Raw bits. Lossless data compression.
Lecture 5: Codes. Prefix codes. Kraft inequality. Huffman coding.
Lecture 6: Guest Lecture by Prof. Tony Jebara. Maximum Entropy methods.
Lecture 7: Optimality of Huffman Codes. Introduction to Channel Capacity and Coding.
Lecture 8: Binary Channels. Joint Typicality. Joint AEP theorem.
Lecture 9: Channel Coding Theorem
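As a small illustration of two topics from the outline above (entropy, Lecture 1; Huffman coding and the Kraft inequality, Lecture 5), here is a minimal sketch in Python. The function names and the example distribution are my own choices, not material from the course; it is meant only as a quick demonstration, not a reference implementation.

```python
import heapq
from math import log2

def entropy(p):
    """Shannon entropy H(X) = -sum p_i log2 p_i, in bits."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def huffman(probs):
    """Build a binary Huffman code for a dict {symbol: probability}.

    Standard greedy construction: repeatedly merge the two
    least-probable subtrees, prepending '0'/'1' to their codewords.
    """
    # Heap entries are (weight, tiebreak_id, {symbol: codeword}).
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

# Example: a dyadic distribution, for which Huffman coding
# achieves the entropy bound exactly.
p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman(p)
H = entropy(p.values())                                   # 1.75 bits
L = sum(p[s] * len(code[s]) for s in p)                   # expected length 1.75
kraft = sum(2 ** -len(cw) for cw in code.values())        # Kraft sum = 1.0
```

For this distribution the expected codeword length equals the entropy (1.75 bits) and the Kraft inequality holds with equality, as expected for a complete prefix code on dyadic probabilities.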