INFORMATION THEORY (ELEN E6717)

Syllabus

Each entry lists the date, topics covered, chapter(s), homework assigned, and homework due.

Jan 18    Course introduction. Information, uncertainty, and entropy. Entropy and data compression: the Source Coding Theorem. Probability review.
          Chapter: 2.1-2.2, 3.1.  Assigned: HW #1.

Jan 25    The Source Coding Theorem: the Asymptotic Equipartition Property and typical sets. Chain rule for entropy. Mutual information and entropy. Jensen's inequality and its consequences.
          Chapter: 2.3-2.8, 2.11, 3.2.  Assigned: HW #2.  Due: HW #1.

Feb 1     Entropy rates of stochastic processes. Lossless data compression: Shannon's First Theorem, the Shannon code, Kraft's inequality.
          Chapter: 3.3, 4.1, 4.2, 5.1-5.4.  Assigned: HW #3.  Due: HW #2.

Feb 8     Huffman codes. Lossless data compression and codes: Shannon-Fano-Elias codes.
          Chapter: 5.5-5.10.  Assigned: HW #4.  Due: HW #3.

Feb 15    Arithmetic coding, Tunstall codes, dictionary-based codes, LZ77 and LZSS.
          Chapter: 5.9-5.12; additional material provided via e-mail.  Assigned: HW #5.  Due: HW #4.

Feb 22    LZ78 and LZW; optimality of LZ77 (LZ1).
          Chapter: additional material provided via e-mail.  Assigned: HW #6.  Due: HW #5.

March 1   Introduction to channel coding.
          Chapter: 8.1-8.8.

March 8   Midterm (covers material up to and including HW #6).

March 15  Spring break.

March 22  The Channel Coding Theorem. Differential entropy.
          Chapter: 8.9-8.12, 9.1-9.5.  Assigned: HW #7.

March 29  The Gaussian channel.
          Chapter: 10.1-10.6.  Assigned: HW #8.  Due: HW #7.

April 5   Rate-distortion theory.
          Chapter: 13.  Assigned: HW #9.  Due: HW #8.

April 12  Rate-distortion theory.
          Chapter: 13.  Assigned: HW #10.  Due: HW #9.

April 19  Information theory and biology.
          Assigned: HW #11.  Due: HW #10.

April 26  Information theory and biology.
          Due: HW #11.

May 3     Review session (by arrangement).

May 10    FINAL EXAM.
For more information, comments, or suggestions, please e-mail us at [email protected].