
This is an introductory course on information theory.
I will cover the first ten chapters (except Chapter 6) of the book Elements of Information Theory (2nd ed.) by Cover and Thomas.
Depending on time, some additional topics may be covered.
Students are required to have taken a course on probability before crediting this course.

Students who need COT can come by my office (ESB 336B) before or after the C slot, starting 31/10/14.

Textbook: Elements of Information Theory (2nd ed.) by T. M. Cover and J. A. Thomas.

Course topics

- Introduction to information theory
- Entropy, relative entropy and mutual information
- Asymptotic equipartition property
- Entropy rate of a stochastic process
- Data compression
- Channel capacity
- Differential entropy
- Gaussian channel
- Rate distortion theory and other topics (time permitting)
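To give a flavor of the first few topics, here is a minimal sketch (in Python, using only the standard library; the distributions are made-up examples) of entropy and mutual information computed from the identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a distribution given as a list of probabilities."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint distribution given as a 2D list."""
    px = [sum(row) for row in joint]          # marginal of X (rows)
    py = [sum(col) for col in zip(*joint)]    # marginal of Y (columns)
    pxy = [x for row in joint for x in row]   # joint, flattened
    return entropy(px) + entropy(py) - entropy(pxy)

# A fair coin carries exactly one bit of entropy.
print(entropy([0.5, 0.5]))            # 1.0

# Independent X and Y have zero mutual information.
joint = [[0.25, 0.25], [0.25, 0.25]]
print(mutual_information(joint))      # 0.0
```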

Grading policy (Tentative)

Homework + projects: 10-15%, miniquizzes: 10%, midsem: 25-30%, endsem: 50%.

Midsem: 03 Mar 2015

Endsem: 01 May 2015

Lectures

- 12 Jan Introduction

- 14 Jan Entropy, joint and conditional entropies

- 16 Jan Relative entropy, mutual information

- 20 Jan Mutual information, set theoretic correspondence of entropies

- 23 Jan Jensen's inequality and consequences

- 27 Jan Log sum inequality and consequences

- 28 Jan Applications of log sum inequality and data processing inequality

- 30 Jan Tutorial

- 02 Feb Fano's inequality

- 03 Feb AEP, Miniquiz-1

- 04 Feb Consequences of AEP, typical sets, data compression

- 10 Feb Tutorial

- 11 Feb Entropy rates of stochastic processes, Miniquiz-2

- 13 Feb Entropy rates and Markov chains

- 16 Feb Entropy rates of Markov chains, random walks and functions of Markov chains

- 18 Feb Data compression: Source coding, Kraft's inequality

- 20 Feb Tutorial, Miniquiz

- 23 Feb Data compression: Extended Kraft's inequality, optimal codes

- 25 Feb Uniquely decodable codes, Huffman codes

- 27 Feb Optimality of Huffman codes

- 02 Mar Huffman codes and Shannon-Fano-Elias codes

- 03 Mar Midsem exam

- 04 Mar Wrapup on data compression and introduction to data transmission

- 09 Mar Channel capacity, computing capacity for some simple channels

- 11 Mar Midsem paper discussion, introduction to jointly typical sequences

- 13 Mar Joint AEP

- 16 Mar Noisy channel coding theorem

- 18 Mar Noisy channel coding theorem

- 20 Mar Tutorial

- 23 Mar Feedback capacity

- 25 Mar Source channel separation theorem, Miniquiz

- 27 Mar Differential entropy

- 30 Mar Properties of differential entropy and related quantities

- 31 Mar AEP and wrap up on differential entropy, Project posted

- 06 Apr Tutorial

- 08 Apr Gaussian channel, channel capacity, Joint AEP

- 10 Apr Capacity of the Gaussian channel

- 13 Apr Capacity of the Gaussian channel

- 15 Apr Bandlimited Gaussian channels

- 20 Apr Parallel Gaussian channels

- 21 Apr Tutorial

- 22 Apr Gaussian channels with colored noise

- 24 Apr Wrap up, Miniquiz

- 01 May Endsem
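As a worked companion to the data-compression lectures, here is a minimal sketch (in Python; the source distribution is a made-up dyadic example) that builds binary Huffman codeword lengths with a heap and checks Kraft's inequality on the result:

```python
import heapq

def huffman_lengths(probs):
    """Return the codeword lengths of a binary Huffman code for the given probabilities."""
    # Each heap entry: (probability, tiebreak id, symbol indices in this subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # merge the two least-probable subtrees
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:                 # every symbol in the merge goes one bit deeper
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
lengths = huffman_lengths(probs)
print(lengths)                            # [1, 2, 3, 3]
# Kraft's inequality: sum of 2^{-l_i} <= 1, with equality for a complete code.
print(sum(2 ** -l for l in lengths))      # 1.0
# Expected length matches the entropy here, since the probabilities are dyadic.
print(sum(p * l for p, l in zip(probs, lengths)))  # 1.75
```

Because the example distribution is dyadic, the Huffman code meets the entropy bound with equality; for general distributions the expected length lies within one bit of the entropy.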