Information theory answers two fundamental questions: what is the maximum rate at which data can be transmitted reliably over a communication link, and what is the fundamental limit of data compression? In this course we will explore answers to these two questions. We will study some practical source compression algorithms, and we will also learn how to compute the channel capacity of simple channels.

Intended Audience: 3rd/4th-year UG students in the EC stream; 1st-year PG students in the communications and signal processing specialization
Prerequisites: Basic knowledge of probability theory and digital communications
Industry Support: Communication companies, defence laboratories
Week 1: Introduction: Entropy, Relative Entropy, Mutual Information; Information Inequalities.
Week 2: Block to variable length coding-I: Prefix-free codes; Block to variable length coding-II: Bounds on optimal codelength; Block to variable length coding-III: Huffman coding.
Week 3: Variable to block length coding; The asymptotic equipartition property; Block to block coding of a DMS.
Week 4: Universal source coding-I: Lempel-Ziv algorithm (LZ77); Universal source coding-II: Lempel-Ziv-Welch algorithm (LZW).
Week 5: Coding for sources with memory; Channel capacity of discrete memoryless channels.
Week 6: Jointly typical sequences; Noisy channel coding theorem; Differential entropy.
Week 7: Gaussian channel; Parallel Gaussian channels.
Week 8: Rate distortion theory; Blahut-Arimoto algorithm for computation of channel capacity and the rate-distortion function.
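To give a flavour of the two central quantities in this outline, the sketch below computes the Shannon entropy of a discrete source (Week 1) and the capacity of a binary symmetric channel, C = 1 - H(p), one of the simple discrete memoryless channels studied in Week 5. The function names are illustrative, not part of any course material.

```python
import math

def entropy(probs):
    """Shannon entropy H(X) in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover
    probability p, in bits per channel use: C = 1 - H(p)."""
    return 1.0 - entropy([p, 1.0 - p])

# A fair coin has maximum entropy for a binary source: 1 bit.
print(entropy([0.5, 0.5]))       # 1.0
# A noiseless BSC (p = 0) has capacity 1 bit per use;
# capacity shrinks toward 0 as p approaches 1/2.
print(bsc_capacity(0.0))         # 1.0
print(bsc_capacity(0.1))
```

Note that capacity is symmetric in p: a channel that flips every bit (p = 1) is as good as a noiseless one, since the flips can be undone.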