Presenter(s)
2021 Croucher Summer Course in Information Theory, The Chinese University of Hong Kong
Lecture
Date
Abstract
Mutual information is a fundamental quantity in information theory. It is widely used in machine learning to measure statistical dependency among different features in data. Applications are numerous, ranging from classification and clustering to representation learning and other tasks that require selecting or extracting lower-dimensional features of the data without losing valuable information. Although mutual information has a precise formula defined in terms of a probability model, it must be estimated for real-world data whose probability model is unknown. In this lecture series, we will dive into some of the applications and estimation methods of mutual information in machine learning. Registered participants will have hands-on coding experience using DIVE, the virtual teaching and learning environment offered by the CityU CS Department.
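To make the distinction between the exact formula and its estimation concrete, here is a minimal sketch (not part of the course materials): it computes the exact mutual information from a known joint probability table, then forms a plug-in estimate from samples by counting empirical frequencies. The binary-symmetric-channel example and all variable names are illustrative assumptions.

```python
import numpy as np

def mutual_information(joint):
    """Mutual information (in nats) of a discrete joint table p(x, y)."""
    px = joint.sum(axis=1, keepdims=True)   # marginal p(x)
    py = joint.sum(axis=0, keepdims=True)   # marginal p(y)
    mask = joint > 0                        # skip zero-probability cells
    return float(np.sum(joint[mask] * np.log(joint[mask] / (px @ py)[mask])))

# Known model: X is a uniform bit, Y flips X with probability 0.1.
joint = np.array([[0.45, 0.05],
                  [0.05, 0.45]])
exact = mutual_information(joint)           # log 2 - H_b(0.1) ≈ 0.368 nats

# Unknown model in practice: estimate the joint table from samples
# (a simple plug-in / histogram estimator).
rng = np.random.default_rng(0)
n = 100_000
x = rng.integers(0, 2, size=n)
y = np.where(rng.random(n) < 0.1, 1 - x, x)
counts = np.zeros((2, 2))
np.add.at(counts, (x, y), 1)
estimate = mutual_information(counts / n)
```

The plug-in estimator works well here because the alphabet is tiny; for high-dimensional or continuous features, more sophisticated estimators (a theme of the lectures) are needed.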