Title : Information Theory

Course No : EE5162

Credits : 3

Prerequisite : EE3110 or equivalent

Syllabus :

  1. Entropy, relative entropy and mutual information (see the sketch after this list)
  2. Asymptotic equipartition property
  3. Entropy rate of a stochastic process
  4. Data compression
  5. Channel capacity
  6. Differential entropy
  7. Gaussian channel
  8. Rate distortion theory
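
For orientation, the following LaTeX sketch writes out standard definitions for the core quantities in syllabus topics 1 and 5 (entropy, relative entropy, mutual information, channel capacity). It is an illustrative summary rather than part of the official course description, and it assumes the conventions of the Cover and Thomas text: discrete alphabets and logarithms to base 2.

\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Entropy of a discrete random variable X with pmf p(x)
\[ H(X) = -\sum_{x} p(x) \log_2 p(x) \]

% Relative entropy (Kullback--Leibler divergence) between pmfs p and q
\[ D(p \,\|\, q) = \sum_{x} p(x) \log_2 \frac{p(x)}{q(x)} \]

% Mutual information between X and Y with joint pmf p(x, y)
\[ I(X;Y) = \sum_{x, y} p(x, y) \log_2 \frac{p(x, y)}{p(x)\, p(y)} \]

% Channel capacity of a discrete memoryless channel p(y|x):
% the maximum of I(X;Y) over all input distributions p(x)
\[ C = \max_{p(x)} I(X;Y) \]

\end{document}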

Text Books :

Elements of Information Theory, by T. M. Cover and J. A. Thomas, 2nd Edition, John Wiley & Sons.

Reference Books :

A First Course in Information Theory, by R. W. Yeung.