Kullback-Leibler Divergence
The Kullback-Leibler Divergence, or KL Divergence, measures how one probability distribution differs from another. Given a 'true' distribution P and an approximation Q, the KL Divergence of Q from P, written D_KL(P || Q), quantifies the information lost when Q is used in place of P. It is not symmetric (D_KL(P || Q) generally differs from D_KL(Q || P)), so it is not a true distance metric. The KL Divergence is used as part of the loss function in the Variational Autoencoder, a generative ML model.
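For discrete distributions, the KL Divergence is defined as D_KL(P || Q) = Σ_x P(x) log( P(x) / Q(x) ). The sketch below is a minimal illustration, assuming two discrete distributions supplied as NumPy arrays (the names p, q, and kl_divergence are placeholders introduced here, not from the original text); it computes the sum directly and cross-checks the result against scipy.stats.entropy, which returns the KL Divergence when given two distributions.

```python
import numpy as np
from scipy.stats import entropy

def kl_divergence(p, q):
    """KL Divergence D_KL(P || Q) for discrete distributions.

    Assumes p and q are valid probability vectors (non-negative,
    summing to 1) with q > 0 wherever p > 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Only terms with p > 0 contribute; 0 * log(0 / q) is treated as 0.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Hypothetical example: a 'true' distribution P and an approximation Q
# over three outcomes.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

print(kl_divergence(p, q))  # direct computation (in nats)
print(entropy(p, q))        # scipy.stats.entropy agrees
```

Swapping the arguments (kl_divergence(q, p)) generally gives a different value, which reflects the asymmetry noted above.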