ECE/TCOM-5583: Information Theory and Probabilistic Programming
There has been a strong resurgence of AI in recent years. A core technology of AI is statistical learning, which aims to automatically “program” machines with data. While the idea dates back to the 1950s, the plethora of data and inexpensive computational power available today allow these techniques to thrive and penetrate every aspect of our daily lives: customer behavior prediction, financial market prediction, fully automatic surveillance, self-driving vehicles, autonomous robots, and beyond.
Information theory was introduced and developed by the great communications engineer Claude Shannon in 1948. The theory was originally conceived to explain the principles behind point-to-point communication and data storage. It has since been incorporated into statistical learning and has inspired many of its underlying principles. In this graduate course, we will explore the exciting area of statistical learning from the perspective of information theorists. This helps students gain a deeper understanding of the omnipresent field of statistical learning and appreciate the widespread significance of information theory. Moreover, we will look into recent advances in probabilistic programming technology, which lets users tackle inference problems directly through computer programs.
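As a taste of the kind of inference a probabilistic programming tool automates, the sketch below (a minimal illustration only; the coin-bias model, data, and grid size are assumptions made up for this example, not course material) computes a Bayesian posterior over a coin's bias by brute-force grid approximation in Python:

```python
import numpy as np

# Hypothetical example: infer the bias p of a coin from observed flips.
# A probabilistic programming tool would let us declare the prior and
# likelihood and then sample the posterior automatically; here we
# approximate the same posterior by brute force on a grid.

flips = np.array([1, 0, 1, 1, 0, 1, 1, 1])   # illustrative data (1 = heads)
grid = np.linspace(0.001, 0.999, 999)        # candidate values of p

prior = np.ones_like(grid)                   # uniform prior on p
likelihood = grid**flips.sum() * (1 - grid)**(len(flips) - flips.sum())
posterior = prior * likelihood
posterior /= posterior.sum()                 # normalize to a distribution

print("Posterior mean of p:", (grid * posterior).sum())
print("MAP estimate of p:  ", grid[posterior.argmax()])
```

A probabilistic programming language expresses the same model declaratively and replaces the hand-written grid with a general-purpose inference engine.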
The course will start with an overview of information theory and statistical learning. We will then help students establish a solid foundation in core information theory principles, including information measures, the AEP, and source and channel coding theory. Next, we will introduce common yet powerful statistical techniques such as Bayesian learning, decision forests, and belief propagation, and discuss how these modern statistical learning techniques are connected to information theory. To wrap up, we will skim through some probabilistic programming tools. The main reference text is Professor MacKay's book, Information Theory, Inference, and Learning Algorithms, but we will also borrow heavily from materials available online. Other important reference materials are listed below.
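To make the information measures above concrete, here is a small Python sketch (the joint distribution of the two binary variables is made up purely for illustration) that computes entropy and mutual information directly from their definitions:

```python
import numpy as np

# Illustrative joint distribution p(x, y) of two binary random variables.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)   # marginal p(x)
p_y = p_xy.sum(axis=0)   # marginal p(y)

def entropy(p):
    """Shannon entropy in bits: H(p) = -sum_i p_i log2 p_i."""
    p = p[p > 0]                      # 0 log 0 is treated as 0
    return -(p * np.log2(p)).sum()

H_x, H_y, H_xy = entropy(p_x), entropy(p_y), entropy(p_xy.ravel())
I_xy = H_x + H_y - H_xy               # I(X;Y) = H(X) + H(Y) - H(X,Y)

print(f"H(X) = {H_x:.3f} bits, H(Y) = {H_y:.3f} bits, I(X;Y) = {I_xy:.3f} bits")
```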
Other Reference Materials
C. E. Shannon, "A mathematical theory of communication," Bell System Technical Journal, vol. 27, pp. 379-423, 623-656, 1948.
R. W. Yeung, On entropy, information inequalities, and Groups.
Law of Large Numbers by Terry Tao.
A First Course in Information Theory, by Raymond W. Yeung, New York: Springer.
Information Theory and Reliable Communication by R. Gallager, New York: Wiley.
Information Theory by Csiszar and Korner, New York: Academic Press.
Entropy and Information Theory by R. M. Gray, Springer-Verlag, 1990.
J. S. Yedidia, W. T. Freeman, and Y. Weiss, Understanding Belief Propagation and its Generalizations in Exploring Artificial Intelligence in the New Millennium: Science and Technology Books, 2003.
S. Verdú, "Fifty years of Shannon theory," IEEE Transactions on Information Theory, vol. 44, pp. 2057-2078, 1998.
I. Csiszár, "The method of types," IEEE Transactions on Information Theory, vol. 44, pp. 2505-2523, 1998.
Information Theoretic Inequality Prover.
Network Information Theory by El Gamal and Kim.
Stochastic Processes: Theory for Applications by Robert Gallager, 2013.
T. Hofmann, B. Schölkopf, and A. J. Smola, Kernel Methods in Machine Learning.
Office Hours
There are no “regular” office hours, but you are welcome to catch me anytime or contact me through email.
Course Syllabus (Tentative)
Probability review
Maximum likelihood estimator, MAP, Bayesian estimator
Graphical models and message passing algorithms
Lossless source coding theory, Huffman coding, and introduction to arithmetic coding (see the Python sketch at the end of this section)
Asymptotic equipartition property (AEP), typicality and joint typicality
Entropy, conditional entropy, mutual information, and their properties
Channel coding theory, capacity, and Fano’s inequality
Continuous random variables, differential entropy, Gaussian source, and Gaussian channel
Error correcting codes, linear codes, and introduction to low-density parity-check codes
Method of types, large deviation theory, and the maximum entropy principle
N.B. You should expect to be exposed to some Python and Matlab. You won't become an expert in these tools after this class, but it is good to get your hands dirty and play with them early.
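For instance, here is a minimal Python sketch of the Huffman coding procedure listed in the syllabus above (the symbol probabilities are made up purely for illustration):

```python
import heapq

# Hypothetical symbol probabilities, chosen only for illustration.
probs = {"a": 0.45, "b": 0.25, "c": 0.15, "d": 0.10, "e": 0.05}

def huffman_code(probs):
    """Build a binary Huffman code: repeatedly merge the two least likely
    nodes, prepending '0'/'1' to the codewords inside each merged node."""
    # Heap entries: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, code1 = heapq.heappop(heap)
        p2, _, code2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in code1.items()}
        merged.update({s: "1" + c for s, c in code2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

code = huffman_code(probs)
avg_len = sum(probs[s] * len(code[s]) for s in probs)
print(code)
print("Average codeword length:", avg_len, "bits/symbol")
```

With these probabilities the average codeword length comes out to 2.0 bits/symbol, close to the source entropy of roughly 1.98 bits, which is exactly the kind of gap the Source Coding Theorem bounds.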
Adjustments for Pregnancy/Childbirth Related Issues
Should you need modifications or adjustments to your course requirements because of documented pregnancy-related or childbirth-related issues, please contact me as soon as possible to discuss. Generally, modifications will be made where medically necessary and similar in scope to accommodations based on temporary disability. Please see www.ou.edu/content/eoo/faqs/pregnancy-faqs.html for commonly asked questions.
Title IX Resources
For any concerns regarding gender-based discrimination, sexual harassment, sexual misconduct, stalking, or intimate partner violence, the University offers a variety of resources, including advocates on call 24/7, counseling services, mutual no-contact orders, scheduling adjustments, and disciplinary sanctions against the perpetrator. Please contact the Sexual Misconduct Office at 405-325-2215 (8-5, M-F) or OU Advocates at 405-615-0013 (24/7) to learn more or to report an incident.
Projects
The final project report is due on 12/13.
Late Policy
Grading
Homework: 30%.
"Mid-term": 30%. Closed book test, and no Internet and cell-phones, but you can bring calculator and two pieces of letter-size cheat sheet
Final Project: 40%.
Final Grade:
A: \(\sim\) 80 and above
B: \(\sim\) between 60 and 80
C: \(\sim\) between 40 and 60
D: \(\sim\) between 20 and 40
F: Below 20
Calendar
Date | Topics | Materials |
8/21 | Overview of IT, probability overview, independence vs conditional independence, formal probability model (screencast1, screencast2) | (slides from Berkeley CS188), slides2019a |
8/28 | ML, MAP, Bayesian inference, constraint optimization, Lagrange multiplier, Karush-Kuhn-Tucker condition, overview of Source Coding Theorem, Kraft's inequality (screencast1, screencast2 ) | slides2019b |
9/4 | Proof of Source Coding Theorem, Shannon-Fano-Elias codes, forward proof of Source Coding Theorem, differential entropy, entropy of Gaussian source (screencast1, screencast2) | |
9/11 | Jensen's inequality, KL-divergence, Theil index, cross-entropy, joint entropy, conditional entropy (screencast1, screencast2) | HW1 |
9/18 | Mutual information, data processing inequality, Shannon's perfect secrecy, decision tree, random forest (screencast1, screencast2) | |
9/25 | Law of large numbers, asymptotic equipartition, typical sequences, jointly typical sequences, packing lemma, solution of HW1 (screencast1, screencast2) | HW2 |
10/2 | Packing lemma, covering lemma, channel capacity, binary symmetric channel, Gaussian channel (screencast1, screencast2) | |
10/9 | Forward proof of channel coding theorem, Fano's inequality, solution of HW2 (screencast1, screencast2) | HW3 |
10/16 | Converse proof of channel coding theorem, method of types (screencast1, screencast2) | slides_2019c |
10/23 | Universal source coding, Lempel-Ziv, Sanov's theorem, Conditional Limit Theorem (screencast1, screencast2) | HW4 |
10/30 | Example of Conditional Limit Theorem, covariance matrices (screencast1, screencast2) | |
11/6 | Principal component analysis (PCA), marginalizing and conditioning of multivariate normal distribution (screencast1, screencast2) | solution of HW4, project abstract |
11/20 | Gaussian processes, product of Gaussian distributions (screencast1, screencast2) | |
11/27 | Division of Gaussian distributions, Gaussian mixture model, expectation-maximization (screencast1, screencast2) | HW5, EM |
12/3 | Conjugate prior, Beta distribution, Dirichlet distribution, exponential family (screencast1, screencast2, screencast3) | |