| Date | Topic | Reading | HW assigned | HW due |
| --- | --- | --- | --- | --- |
| | Course introduction. Probability review. The basic quantities of Information Theory. | | | |
| | The basic quantities: Entropy, Relative Entropy, Mutual Information. The Asymptotic Equipartition Property and typical sets: introduction to stochastic processes. | 3.1, 3.2 | | |
| | Entropy rates of stochastic processes: introduction to data compression. Lossless Data Compression: Shannon's First Theorem, Shannon's Code, Kraft's Inequality. | | | |
| | Huffman Codes. Lossless Data Compression and Codes: Shannon-Fano-Elias codes. | | | |
| | Arithmetic Coding, Tunstall Codes, Dictionary-Based Codes, LZ77 and LZSS. | | | |
| | LZ78 and LZW, optimality of LZ1. | | | |
| | Introduction to Channel Coding. | | | |
| | More Channel Coding. | 9.1 to 9.5 | | |
| | Midterm (covers up to HW #3 included). | | | |
| Nov 4 | Election Day, University Holiday (no class). | | | |
| | Linear Codes. Introduction to Rate-Distortion Theory: Lossy Compression. | | HW #5 | |
| | More on Rate-Distortion Theory. | | | |
| | Vector Quantization. Information Theory and Statistics: the method of types. | | HW #6 | HW #5 |
| | Information Theory and Statistics: the method of types. | 12.1 to 12.6 | | |
| Dec 9 | Review Session. | | | HW #6 |
| | FINAL EXAM | | | |