Title : Introduction to Information Theory and Coding

Course No : EE5142

Credits : 4

Prerequisite : Probability and Random Processes, Digital Communications

Syllabus :

1) Entropy, Relative Entropy, and Mutual Information:

Entropy, Joint Entropy and Conditional Entropy, Relative Entropy and Mutual Information, Chain Rules, Data-Processing Inequality, Fano’s Inequality
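
These quantities are easy to compute numerically. A minimal Python sketch (the 2x2 joint pmf is an arbitrary illustrative choice, not course data) evaluating entropy, conditional entropy via the chain rule, and mutual information:

    import numpy as np

    # Arbitrary 2x2 joint pmf p(x, y), chosen only for illustration.
    p_xy = np.array([[0.4, 0.1],
                     [0.1, 0.4]])

    def H(p):
        # Shannon entropy in bits of a pmf given as an array.
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    p_x = p_xy.sum(axis=1)            # marginal pmf of X
    p_y = p_xy.sum(axis=0)            # marginal pmf of Y

    H_joint = H(p_xy.ravel())         # joint entropy H(X, Y)
    H_y_given_x = H_joint - H(p_x)    # chain rule: H(Y|X) = H(X, Y) - H(X)
    I = H(p_x) + H(p_y) - H_joint     # mutual information I(X; Y)

    print(H(p_x), H_y_given_x, I)     # 1.0, ~0.722, ~0.278 bits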

2) Typical Sequences and Asymptotic Equipartition Property:

Asymptotic Equipartition Property Theorem, Consequences of the AEP: Data Compression, High-Probability Sets and the Typical Set
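
A short simulation makes the AEP concrete: for i.i.d. Bernoulli(p) sequences, almost every sampled sequence is typical once n is large. A Python sketch (the parameters p, n, and eps are arbitrary illustrative choices):

    import numpy as np

    rng = np.random.default_rng(0)
    p, n, eps, trials = 0.3, 1000, 0.05, 2000        # illustrative parameters
    Hp = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))  # entropy of Bernoulli(p)

    x = rng.random((trials, n)) < p                  # i.i.d. Bernoulli(p) sequences
    k = x.sum(axis=1)                                # number of ones in each sequence
    # Per-symbol self-information -(1/n) log2 p(x^n) of each sequence.
    logp = -(k * np.log2(p) + (n - k) * np.log2(1 - p)) / n
    typical = np.abs(logp - Hp) < eps                # membership in the typical set
    print(typical.mean())                            # close to 1, as the AEP predicts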

3) Source Coding and Data Compression:

Kraft Inequality, Huffman Codes, Optimality of Huffman Codes
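
A compact Python sketch of the Huffman construction (the dyadic pmf is an illustrative choice; for it the codeword lengths equal -log2 p(x), so the Kraft inequality is met with equality and the expected length attains the entropy):

    import heapq

    def huffman(pmf):
        # Build a binary Huffman code for pmf {symbol: prob}.
        # Heap entries are (probability, tiebreak, subtree); a subtree is
        # either a symbol or a pair of subtrees.
        heap = [(p, i, s) for i, (s, p) in enumerate(pmf.items())]
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            p0, _, t0 = heapq.heappop(heap)   # two least-probable subtrees
            p1, _, t1 = heapq.heappop(heap)
            heapq.heappush(heap, (p0 + p1, count, (t0, t1)))
            count += 1
        code = {}
        def walk(t, prefix):
            if isinstance(t, tuple):
                walk(t[0], prefix + '0')
                walk(t[1], prefix + '1')
            else:
                code[t] = prefix or '0'
        walk(heap[0][2], '')
        return code

    pmf = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}   # dyadic example
    print(huffman(pmf))   # lengths 1, 2, 3, 3 match -log2 p(x)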

4) Channel Capacity:

Symmetric Channels, Properties of Channel Capacity, Jointly Typical Sequences, Channel Coding Theorem, Fano’s Inequality and the Converse to the Coding Theorem
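
For the two standard symmetric channels the capacity has a closed form, which a few lines of Python can evaluate (the probability 0.11 is an arbitrary illustrative value):

    import numpy as np

    def h2(p):
        # Binary entropy function in bits.
        return 0.0 if p in (0.0, 1.0) else -p * np.log2(p) - (1 - p) * np.log2(1 - p)

    p = 0.11                 # illustrative crossover / erasure probability
    C_bsc = 1 - h2(p)        # BSC capacity 1 - H(p), achieved by a uniform input
    C_bec = 1 - p            # BEC capacity 1 - p
    print(C_bsc, C_bec)      # ~0.50 and 0.89 bits per channel use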

5) Differential Entropy and Gaussian Channel:

Differential Entropy, AEP for Continuous Random Variables, Properties of Differential Entropy, Relative Entropy, and Mutual Information, Coding Theorem for Gaussian Channels
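
The coding theorem for the Gaussian channel gives C = (1/2) log2(1 + P/N) bits per real channel use, which is easy to tabulate; a Python sketch at a few illustrative SNRs:

    import numpy as np

    snr_db = np.array([0.0, 10.0, 20.0])        # illustrative SNRs in dB
    snr = 10 ** (snr_db / 10)                   # P/N on a linear scale
    C = 0.5 * np.log2(1 + snr)                  # bits per real channel use
    print(dict(zip(snr_db, np.round(C, 3))))    # {0.0: 0.5, 10.0: 1.73, 20.0: 3.329}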

6) Linear Binary Block Codes:

Introduction, Generator and Parity-Check Matrices, Repetition and Single-Parity-Check Codes, Binary Hamming Codes, Error Detection with Linear Block Codes, Weight Distribution and Minimum Hamming Distance of a Linear Block Code, Hard-decision and Soft-decision Decoding of Linear Block Codes, Cyclic Codes, Parameters of BCH and Reed-Solomon (RS) Codes, Interleaved and Concatenated Codes
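
A Python sketch of encoding and single-error correction by syndrome decoding for the systematic (7,4) binary Hamming code (the message and the flipped bit are arbitrary illustrative choices):

    import numpy as np

    # Systematic (7,4) binary Hamming code: G = [I | P], H = [P^T | I].
    P = np.array([[1, 1, 0],
                  [1, 0, 1],
                  [0, 1, 1],
                  [1, 1, 1]])
    G = np.hstack([np.eye(4, dtype=int), P])
    Hm = np.hstack([P.T, np.eye(3, dtype=int)])

    msg = np.array([1, 0, 1, 1])
    cw = msg @ G % 2                   # encode the message
    rx = cw.copy(); rx[2] ^= 1         # flip one bit to simulate a channel error
    syn = rx @ Hm.T % 2                # syndrome of the received word
    # Each single-bit error yields a distinct nonzero syndrome, namely the
    # corresponding column of H: locate that column and flip the bit back.
    for i in range(7):
        if np.array_equal(Hm[:, i], syn):
            rx[i] ^= 1
    print(np.array_equal(rx, cw))      # True: the single error was corrected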

7) Convolutional Codes:

Encoder Realizations and Classifications, Minimal Encoders, Trellis Representation, Maximum-Likelihood Sequence Detection (MLSD) and the Viterbi Algorithm, Bit-wise MAP Decoding and the BCJR Algorithm
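
A minimal hard-decision Viterbi decoder for the classic rate-1/2, constraint-length-3 convolutional code with generators (7, 5) in octal, sketched in Python (the message and the error position are arbitrary illustrative choices):

    # Rate-1/2 convolutional code, generators 7 and 5 (octal), memory 2.
    G = (0b111, 0b101)

    def encode(bits):
        state, out = 0, []
        for b in bits:
            reg = (b << 2) | state                 # 3-bit register, newest bit first
            out += [bin(reg & g).count('1') % 2 for g in G]
            state = reg >> 1
        return out

    def viterbi(rx):
        # metric[s] = best Hamming distance of any path ending in state s.
        metric, paths = {0: 0}, {0: []}            # encoder starts in state 0
        for i in range(0, len(rx), 2):
            new_metric, new_paths = {}, {}
            for s, m in metric.items():
                for b in (0, 1):                   # try both input bits
                    reg = (b << 2) | s
                    o = [bin(reg & g).count('1') % 2 for g in G]
                    d = m + (o[0] != rx[i]) + (o[1] != rx[i + 1])
                    ns = reg >> 1
                    if ns not in new_metric or d < new_metric[ns]:
                        new_metric[ns], new_paths[ns] = d, paths[s] + [b]
            metric, paths = new_metric, new_paths
        return paths[min(metric, key=metric.get)]  # survivor with the best metric

    msg = [1, 0, 1, 1, 0, 0]       # two tail zeros flush the encoder to state 0
    rx = encode(msg)
    rx[3] ^= 1                     # one channel bit error
    print(viterbi(rx) == msg)      # True: MLSD recovers the message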

Text Books :

  1. Elements of Information Theory by Thomas Cover and Joy Thomas
  2. Channel Codes: Classical and Modern by William Ryan and Shu Lin

References :

  1. Information Theory and Reliable Communication by Robert Gallager