An intuitive walkthrough of three important, interrelated concepts in machine learning: information, entropy, and Kullback-Leibler (KL) divergence.
Before we dive deep into what entropy, information, and KL divergence are, we must first understand the need for these terms and the problems they solve.
In our basic statistics/maths courses we come across many different distributions, such as the Gaussian, Bernoulli, and Beta. The probability density function (p.d.f.) or probability mass function (p.m.f.) differs from one distribution to another. We say two distributions are equal iff they have exactly the same p.d.f. or p.m.f. Now the question arises: if two…
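As a quick sketch of this idea, we can write two distributions as p.m.f.s and check whether they agree at every outcome; the KL divergence (previewed here, defined properly later) then quantifies *how much* they differ. The Bernoulli parameters below are illustrative values, not taken from the article.

```python
import math

# Two Bernoulli distributions written as p.m.f.s over the outcomes {0, 1}.
# (Hypothetical example values for illustration.)
p = {0: 0.5, 1: 0.5}   # a fair coin
q = {0: 0.9, 1: 0.1}   # a biased coin

# Two distributions are equal iff their p.m.f.s agree at every outcome.
equal = all(math.isclose(p[x], q[x]) for x in p)
print("p == q ?", equal)

# A first taste of what comes next: the KL divergence
#   D_KL(p || q) = sum_x p(x) * log(p(x) / q(x))
# measures how different p is from q, and is 0 iff p == q.
kl = sum(p[x] * math.log(p[x] / q[x]) for x in p)
print("D_KL(p || q) =", kl)
```

Checking pointwise equality only gives a yes/no answer; the rest of the article is about turning that yes/no into a meaningful number.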