Basic Examples (2)
Compute the Kullback–Leibler divergence for two multinormal distributions:
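A possible input for this example; the distribution parameters are illustrative assumptions:
ResourceFunction["MultinormalKLDivergence"][
 MultinormalDistribution[{0, 0}, {{1, 0}, {0, 1}}],
 MultinormalDistribution[{1, 2}, {{2, 1/2}, {1/2, 1}}]]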
Plot how the Kullback–Leibler divergence varies as the center of the second multinormal distribution moves on a line extended from the origin:
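A sketch of such a plot, assuming the second mean moves along the direction {1, 1} and both covariances are the identity:
Plot[
 ResourceFunction["MultinormalKLDivergence"][
  MultinormalDistribution[{0, 0}, IdentityMatrix[2]],
  MultinormalDistribution[t {1, 1}, IdentityMatrix[2]]],
 {t, 0, 5}, AxesLabel -> {"t", "KL divergence"}]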
Scope (3)
Compute the Kullback–Leibler divergence of two normal distributions:
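A possible input with illustrative parameters, assuming univariate NormalDistribution objects are accepted as this example suggests:
ResourceFunction["MultinormalKLDivergence"][
 NormalDistribution[0, 1], NormalDistribution[2, 3]]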
Compute the Kullback–Leibler divergence of two binormal distributions:
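For example, with assumed means, scales and correlation:
ResourceFunction["MultinormalKLDivergence"][
 BinormalDistribution[{0, 0}, {1, 1}, 0],
 BinormalDistribution[{1, 1}, {2, 1}, 1/3]]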
Compute the Kullback–Leibler divergence symbolically:
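A sketch using symbolic parameters; a two-dimensional case with diagonal covariance matrices is assumed for brevity:
FullSimplify[
 ResourceFunction["MultinormalKLDivergence"][
  MultinormalDistribution[{a1, a2}, DiagonalMatrix[{s1^2, s2^2}]],
  MultinormalDistribution[{b1, b2}, DiagonalMatrix[{t1^2, t2^2}]]],
 Assumptions -> {s1 > 0, s2 > 0, t1 > 0, t2 > 0}]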
Applications (3)
Compare the Kullback–Leibler divergence of two multinormal distributions oriented along orthogonal axes as the parameters of their covariance matrices change:
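A sketch, assuming one distribution is elongated along the x axis and the other along the y axis, with the elongation a as the parameter:
Plot[
 ResourceFunction["MultinormalKLDivergence"][
  MultinormalDistribution[{0, 0}, DiagonalMatrix[{a, 1}]],
  MultinormalDistribution[{0, 0}, DiagonalMatrix[{1, a}]]],
 {a, 1, 5}, AxesLabel -> {"a", "KL divergence"}]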
Use a Manipulate to compare two multinormal distributions and show their Kullback–Leibler divergence:
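A minimal sketch; the controls, covariance forms and 1σ ellipses are illustrative assumptions:
Manipulate[
 Graphics[{
   {Opacity[0.4], Blue, Ellipsoid[{0, 0}, IdentityMatrix[2]]},
   {Opacity[0.4], Red, Ellipsoid[{mx, my}, s IdentityMatrix[2]]}},
  Axes -> True, PlotRange -> 5,
  PlotLabel -> ResourceFunction["MultinormalKLDivergence"][
    MultinormalDistribution[{0, 0}, IdentityMatrix[2]],
    MultinormalDistribution[{mx, my}, s IdentityMatrix[2]]]],
 {{mx, 1}, -3, 3}, {{my, 1}, -3, 3}, {{s, 1}, 1/2, 3}]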
Compute the symbolic Kullback–Leibler divergence for a multinormal distribution with itself, but with the second distribution undergoing a rotation of its covariance matrix:
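A sketch, assuming a diagonal covariance matrix rotated by an angle θ while the means stay equal:
FullSimplify[
 ResourceFunction["MultinormalKLDivergence"][
  MultinormalDistribution[{0, 0}, DiagonalMatrix[{σ1^2, σ2^2}]],
  MultinormalDistribution[{0, 0},
   RotationMatrix[θ] . DiagonalMatrix[{σ1^2, σ2^2}] . Transpose[RotationMatrix[θ]]]],
 Assumptions -> {σ1 > 0, σ2 > 0, 0 < θ < π}]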
The same computation, but with a three-dimensional multinormal distribution:
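A corresponding sketch, assuming the rotation is about the z axis:
FullSimplify[
 ResourceFunction["MultinormalKLDivergence"][
  MultinormalDistribution[{0, 0, 0}, DiagonalMatrix[{σ1^2, σ2^2, σ3^2}]],
  MultinormalDistribution[{0, 0, 0},
   RotationMatrix[θ, {0, 0, 1}] . DiagonalMatrix[{σ1^2, σ2^2, σ3^2}] .
    Transpose[RotationMatrix[θ, {0, 0, 1}]]]],
 Assumptions -> {σ1 > 0, σ2 > 0, σ3 > 0, 0 < θ < π}]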
Properties and Relations (2)
One can use the more general resource function KullbackLeiblerDivergence, but MultinormalKLDivergence is somewhat faster for low-dimensional distributions:
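A possible timing comparison on two-dimensional distributions (the parameters are illustrative):
d1 = MultinormalDistribution[{0, 0}, {{2, 1/2}, {1/2, 1}}];
d2 = MultinormalDistribution[{1, 1}, {{1, 0}, {0, 3}}];
{First[RepeatedTiming[ResourceFunction["KullbackLeiblerDivergence"][d1, d2]]],
 First[RepeatedTiming[ResourceFunction["MultinormalKLDivergence"][d1, d2]]]}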
MultinormalKLDivergence is much faster for higher-dimensional distributions:
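A sketch with randomly generated five-dimensional distributions; the dimension and the random construction of a positive-definite covariance are assumptions:
n = 5;
m = RandomReal[{-1, 1}, {n, n}];
d1 = MultinormalDistribution[ConstantArray[0, n], IdentityMatrix[n]];
d2 = MultinormalDistribution[RandomReal[1, n], m . Transpose[m] + IdentityMatrix[n]];
{First[AbsoluteTiming[ResourceFunction["KullbackLeiblerDivergence"][d1, d2]]],
 First[AbsoluteTiming[ResourceFunction["MultinormalKLDivergence"][d1, d2]]]}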
Possible Issues (2)
If the two multinormal distributions supplied have different dimensions, a Failure object will result:
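For example, supplying a two-dimensional and a three-dimensional distribution:
ResourceFunction["MultinormalKLDivergence"][
 MultinormalDistribution[{0, 0}, IdentityMatrix[2]],
 MultinormalDistribution[{0, 0, 0}, IdentityMatrix[3]]]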
If one or more of the distributions is not a multivariate normal distribution, a Failure object will result:
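For example, supplying a multivariate t distribution as the second argument:
ResourceFunction["MultinormalKLDivergence"][
 MultinormalDistribution[{0, 0}, IdentityMatrix[2]],
 MultivariateTDistribution[{{1, 0}, {0, 1}}, 3]]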
Neat Examples (5)
Use the Kullback–Leibler divergence to help regularize a variational autoencoder. Create an expression for the Kullback–Leibler divergence between a multinormal distribution and one with zero mean and a unit, diagonal covariance matrix:
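A sketch, assuming a two-dimensional latent space and a diagonal covariance for the first distribution; the result should reduce to the familiar closed form 1/2 Sum[σi^2 + μi^2 - 1 - Log[σi^2]]:
klExpr = FullSimplify[
  ResourceFunction["MultinormalKLDivergence"][
   MultinormalDistribution[{μ1, μ2}, DiagonalMatrix[{σ1^2, σ2^2}]],
   MultinormalDistribution[{0, 0}, IdentityMatrix[2]]],
  Assumptions -> {σ1 > 0, σ2 > 0}]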
Use the resulting expression to create a function layer that can be used for regularization:
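A sketch of such a layer. It takes latent mean and log-variance vectors (a common VAE parameterization, assumed here) and evaluates the closed form above; the port shapes are inferred once the layer is composed into a net:
klLayer = FunctionLayer[
  0.5*Total[Exp[#LogVar] + #Mean^2 - 1 - #LogVar] &]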
Build a variational autoencoder architecture that uses the Kullback–Leibler divergence to prevent overfitting and to encourage the model to learn a compact and meaningful representation of the data:
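A minimal sketch of such an architecture, assuming length-8 input vectors, a two-dimensional latent space and a mean/log-variance encoder; the layer sizes and the reparameterization via RandomArrayLayer are illustrative choices rather than the definitive construction:
latent = 2; n = 8;
vae = NetGraph[<|
   "encode" -> NetChain[{LinearLayer[16], Tanh, LinearLayer[2 latent]}],
   "mean" -> PartLayer[1 ;; latent],
   "logvar" -> PartLayer[latent + 1 ;; 2 latent],
   "noise" -> RandomArrayLayer[NormalDistribution[0, 1], "Output" -> latent],
   "sample" -> FunctionLayer[#Mean + Exp[0.5 #LogVar]*#Noise &], (* reparameterization trick *)
   "decode" -> NetChain[{LinearLayer[16], Tanh, LinearLayer[n]}],
   "recLoss" -> MeanSquaredLossLayer[],
   "klLoss" -> FunctionLayer[0.5*Total[Exp[#LogVar] + #Mean^2 - 1 - #LogVar] &],
   "total" -> TotalLayer[]|>,
  {NetPort["Input"] -> "encode",
   "encode" -> {"mean", "logvar"},
   "mean" -> NetPort["sample", "Mean"],
   "logvar" -> NetPort["sample", "LogVar"],
   "noise" -> NetPort["sample", "Noise"],
   "sample" -> "decode",
   "decode" -> NetPort["recLoss", "Input"],
   NetPort["Input"] -> NetPort["recLoss", "Target"],
   "mean" -> NetPort["klLoss", "Mean"],
   "logvar" -> NetPort["klLoss", "LogVar"],
   {"recLoss", "klLoss"} -> "total",
   "total" -> NetPort["Loss"]},
  "Input" -> n]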
Create structured data that one wants the variational autoencoder to capture:
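One possibility, assumed to match the length-8 inputs of the sketch above: smooth vectors sampled from random sinusoids:
data = Table[
   With[{a = RandomReal[{1/2, 1}], f = RandomReal[{1, 2}], p = RandomReal[{0, 2 Pi}]},
    Table[a Sin[2 Pi f t/8 + p], {t, 8}]], {500}];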
Use a RadialAxisPlot to show the smooth and compact reconstruction of the original vector from a latent space:
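A sketch building on the previous two examples: train the network (the loss port and training length are assumptions), rebuild an encoder–decoder chain from the trained pieces, and compare an original vector with its reconstruction:
trained = NetTrain[vae, <|"Input" -> data|>, LossFunction -> "Loss", MaxTrainingRounds -> 50];
reconstruct = NetChain[{
   NetExtract[trained, "encode"],
   PartLayer[1 ;; 2], (* use the latent mean as the code *)
   NetExtract[trained, "decode"]}];
RadialAxisPlot[{First[data], reconstruct[First[data]]}]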