An intuitive walkthrough of three important, interrelated concepts in machine learning: Information, Entropy, and Kullback-Leibler Divergence.

Introduction: How do we measure the difference between distributions?

Before we dive deep into what entropy, information, and KL divergence are, we must first understand the need for these terms and what problems they solve.

In a basic statistics or maths course, we come across many different distributions, such as the Gaussian, Bernoulli, and Beta. The probability density function (p.d.f.) or probability mass function (p.m.f.) differs from one distribution to another. We say two distributions are equal if and only if they have the exact same p.d.f. or p.m.f. Now the question arises: if two…
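As a preview of where this is heading, here is a minimal sketch (not from the original article; the two Bernoulli p.m.f.s and their probabilities are arbitrary choices for illustration). It contrasts the all-or-nothing test of p.m.f. equality with KL divergence, which instead grades *how far apart* two distributions are:

```python
import numpy as np

# Two Bernoulli p.m.f.s over the same outcomes {0, 1}.
# They are "equal" distributions only if every probability matches exactly.
p = np.array([0.5, 0.5])   # a fair coin
q = np.array([0.4, 0.6])   # a slightly biased coin

print(np.array_equal(p, q))  # False: the p.m.f.s differ, so the distributions differ

# KL divergence D(p || q) = sum_x p(x) * log(p(x) / q(x)) quantifies *how*
# different q is from p, rather than giving a binary yes/no answer.
kl = np.sum(p * np.log(p / q))
print(kl)  # ~0.0204 nats: a small but nonzero divergence
```

The equality check can only say "different"; the divergence tells us the two coins are in fact quite close, which is the kind of graded comparison the rest of the article builds toward.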
