The aim of this module is to introduce you to the abstract theory of the quantification, storage, and communication of information in a modern way. We will learn how to construct codes for compression and error correction. We will also study the fundamental limits of data compression and channel coding, culminating in the source-channel separation theorem. Along the way we will learn about many important tools required in the mathematical analysis of information processing problems and we will discuss various applications of information theory, for example to cryptography and learning theory.

The lecture notes here are provided with the understanding that they are properly attributed when used. I appreciate any constructive feedback.

- Frontmatter (pdf)
- Chapter 0: Review of mathematical notation and foundations (pdf)
- Chapter 1: Information measures (pdf)
- Chapter 2: Source coding (pdf)
- Chapter 3: Application to cryptography: Randomness extraction (pdf)
- Chapter 4: Application to statistics: Binary hypothesis testing (pdf)
- Chapter 5: Error correcting codes (pdf)
- Chapter 6: Noisy channel coding (pdf)
- Chapter 7: Application to learning theory: Complexity lower bounds (pdf)
- Endmatter (pdf)

The complete lecture notes in a single file are also available (pdf).