| LEC # | TOPICS | KEY DATES |
| --- | --- | --- |
| 1 | Introduction, Review of Random Variables, Entropy, Mutual Information, Chain Rules | |
| 2 | Jensen's Inequality, Data Processing Theorem, Fano's Inequality | PS 1 issued |
| 3 | Markov Chains, Entropy Rate of Random Processes | |
| 4 | Different Types of Convergence, Asymptotic Equipartition Property (AEP), Typical Set, Joint Typicality | PS 1 due; PS 2 issued |
| 5 | Data Compression, Kraft Inequality, Optimal Codes | PS 2 due |
| 6 | Huffman Codes, Sensitivity to Distribution, Elias Code | |
| 7 | Gambling | PS 3 issued |
| 8 | Channel Capacity, Symmetric and Erasure Channels | PS 3 due; PS 4 issued |
| 9 | Coding Theorem | |
| 10 | Strong Coding Theorem | |
| 11 | Strong Coding Theorem (cont.) | PS 4 due; PS 5 issued |
| 12 | Feedback Capacity | PS 5 due |
| 13 | Joint Source-Channel Coding | |
| 14 | Differential Entropy | PS 6 issued |
| 15 | Gaussian Channel | |
| 16 | Gaussian Channels: Parallel, Colored Noise, Inter-symbol Interference | PS 6 due; PS 7 issued |
| 17 | Maximizing Entropy | |
| 18 | Gaussian Channels with Feedback | PS 7 due; PS 8 issued |
| 19 | Fading Channels | PS 8 due; PS 9 issued |
| 20 | Types, Universal Source Coding, Sanov's Theorem | PS 9 due; PS 10 issued |
| 21 | Multiple Access Channels | |
| 22 | Slepian-Wolf Coding | PS 10 due |
| 23 | Broadcast Channels | |
| 24 | Channel Side Information, Wide-band Channels | |