| Date | Topics | Reading | HW Assigned | HW Due |
| --- | --- | --- | --- | --- |
| | Course introduction. Information, Uncertainty, and Entropy. Entropy and Data Compression: the Source Coding Theorem. Probability review. | | | |
| | The Source Coding Theorem: the Asymptotic Equipartition Property and typical sets. Chain Rule for Entropy. Mutual Information and Entropy. Jensen's Inequality and consequences. | 3.2 | HW #2 | |
| | Entropy rates of stochastic processes. Lossless Data Compression: Shannon's First Theorem, Shannon's Code, Kraft's Inequality. | | | |
| | Huffman Codes. Lossless Data Compression and Codes: Shannon-Fano-Elias codes. | | HW #4 | HW #3 |
| Feb 15 | Arithmetic Coding, Tunstall Codes, Dictionary-Based Codes, LZ77 and LZSS. | | | |
| | LZ78 and LZW; optimality of LZ1. | | | |
| | Introduction to Channel Coding. | | | |
| | Midterm | | | |
| | Spring Break | | | |
| March 22 | The Channel Coding Theorem. Differential Entropy. | 8.9 to 8.12, 9.1 to 9.5 | | |
| March 29 | The Gaussian Channel. | | HW #8 | |
| | Rate-Distortion Theory. | | HW #9 | |
| | Rate-Distortion Theory. | | HW #10 | HW #9 |
| | Information Theory and Biology. | | HW #11 | |
| April 26 | Information Theory and Biology. | | | HW #11 |
| May 3 | Review Session (by arrangement) | | | |
| | FINAL EXAM | | | |