Function Repository Resource:

# MultinormalKLDivergence

Compute the Kullback–Leibler divergence between two multinormal distributions

Contributed by: Seth J. Chandler
ResourceFunction["MultinormalKLDivergence"][dist1, dist2] computes the Kullback–Leibler divergence between two multinormal distributions.

## Details

MultinormalKLDivergence also works with a univariate NormalDistribution or a bivariate BinormalDistribution, treating each as a special case of the more general MultinormalDistribution.
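For reference, the divergence between two k-dimensional multinormals has a well-known closed form. A direct Wolfram Language rendering of that formula (a sketch for orientation, not necessarily the resource function's actual implementation) might look like:

```wolfram
(* Closed form of KL(dist1 || dist2) for two k-dimensional multinormal
   distributions; a sketch, not the resource function's own code *)
multinormalKL[dist1_, dist2_] :=
 Module[{m1 = Mean[dist1], m2 = Mean[dist2],
   s1 = Covariance[dist1], s2 = Covariance[dist2], k},
  k = Length[m1];
  1/2 (Tr[Inverse[s2] . s1] + (m2 - m1) . Inverse[s2] . (m2 - m1) -
     k + Log[Det[s2]/Det[s1]])]
```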

## Examples

### Basic Examples (2)

Compute the Kullback–Leibler divergence for two multinormal distributions:

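A call of the kind this example shows might look like the following (the specific means and covariance matrices are illustrative):

```wolfram
ResourceFunction["MultinormalKLDivergence"][
 MultinormalDistribution[{0, 0}, {{1, 0}, {0, 1}}],
 MultinormalDistribution[{1, 2}, {{2, 1}, {1, 2}}]]
```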

Plot how the Kullback–Leibler divergence varies as the center of the second multinormal distribution moves on a line extended from the origin:

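One way such a plot could be produced: move the second distribution's mean along the ray {t, t} and plot the divergence against t (the details are illustrative):

```wolfram
Plot[ResourceFunction["MultinormalKLDivergence"][
  MultinormalDistribution[{0, 0}, IdentityMatrix[2]],
  MultinormalDistribution[{t, t}, IdentityMatrix[2]]], {t, 0, 3}]
```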

### Scope (3)

Compute the Kullback–Leibler divergence of two normal distributions:

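An illustrative call with two univariate normal distributions (the parameters are assumptions):

```wolfram
ResourceFunction["MultinormalKLDivergence"][
 NormalDistribution[0, 1], NormalDistribution[1, 2]]
```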

Compute the Kullback–Leibler divergence of two binormal distributions:

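An illustrative call with two binormal distributions, each specified by its means, standard deviations and correlation (the parameters are assumptions):

```wolfram
ResourceFunction["MultinormalKLDivergence"][
 BinormalDistribution[{0, 0}, {1, 1}, 0],
 BinormalDistribution[{1, 1}, {1, 2}, 1/2]]
```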

Compute the Kullback–Leibler divergence symbolically:

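A symbolic call of the kind this example suggests; the symbolic parameters a, b and m are illustrative:

```wolfram
ResourceFunction["MultinormalKLDivergence"][
 MultinormalDistribution[{0, 0}, {{a, 0}, {0, a}}],
 MultinormalDistribution[{m, m}, {{b, 0}, {0, b}}]]
```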

### Applications (3)

Compare the Kullback–Leibler divergence of two orthogonal multinormal distributions as parameters of their covariance matrices change:

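One hedged reading of this example: take two zero-mean distributions with axis-aligned (diagonal) covariance matrices and sweep one variance parameter of each, plotting the divergence over the resulting parameter plane:

```wolfram
ContourPlot[
 ResourceFunction["MultinormalKLDivergence"][
  MultinormalDistribution[{0, 0}, {{a, 0}, {0, 1}}],
  MultinormalDistribution[{0, 0}, {{1, 0}, {0, b}}]],
 {a, 1/2, 2}, {b, 1/2, 2}]
```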

Use a Manipulate to compare two multinormal distributions and show their Kullback–Leibler divergence:

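A sketch of such a Manipulate: draw a density contour of each distribution while sliding the second distribution's mean, labeling the plot with the current divergence (the layout and parameter ranges are assumptions):

```wolfram
Manipulate[
 With[{d1 = MultinormalDistribution[{0, 0}, IdentityMatrix[2]],
   d2 = MultinormalDistribution[{mx, my}, IdentityMatrix[2]]},
  ContourPlot[{PDF[d1, {x, y}] == 0.05, PDF[d2, {x, y}] == 0.05},
   {x, -4, 4}, {y, -4, 4},
   PlotLabel -> ResourceFunction["MultinormalKLDivergence"][d1, d2]]],
 {mx, -2, 2}, {my, -2, 2}]
```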

Compute the symbolic Kullback–Leibler divergence between a multinormal distribution and a copy of itself whose covariance matrix has been rotated:

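A sketch of such a computation, conjugating an illustrative diagonal covariance matrix by a symbolic rotation:

```wolfram
With[{sigma = {{2, 0}, {0, 1/2}}},
 FullSimplify[
  ResourceFunction["MultinormalKLDivergence"][
   MultinormalDistribution[{0, 0}, sigma],
   MultinormalDistribution[{0, 0},
    RotationMatrix[t] . sigma . Transpose[RotationMatrix[t]]]],
  Assumptions -> 0 < t < Pi]]
```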

The same computation, but with a three-dimensional multinormal distribution:

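The three-dimensional analogue might rotate an illustrative diagonal covariance about a fixed axis:

```wolfram
With[{sigma = DiagonalMatrix[{3, 2, 1}]},
 FullSimplify[
  ResourceFunction["MultinormalKLDivergence"][
   MultinormalDistribution[{0, 0, 0}, sigma],
   MultinormalDistribution[{0, 0, 0},
    RotationMatrix[t, {0, 0, 1}] . sigma .
     Transpose[RotationMatrix[t, {0, 0, 1}]]]],
  Assumptions -> 0 < t < Pi]]
```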

### Properties and Relations (2)

One can use the more general resource function KullbackLeiblerDivergence, but MultinormalKLDivergence is somewhat faster for low-dimensional distributions:

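A comparison of this kind might time both resource functions on the same pair of low-dimensional distributions (the distributions are illustrative):

```wolfram
d1 = MultinormalDistribution[{0, 0}, {{2, 1}, {1, 2}}];
d2 = MultinormalDistribution[{1, 1}, IdentityMatrix[2]];
RepeatedTiming[ResourceFunction["KullbackLeiblerDivergence"][d1, d2]]
RepeatedTiming[ResourceFunction["MultinormalKLDivergence"][d1, d2]]
```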

MultinormalKLDivergence is much faster for higher-dimensional distributions:

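The higher-dimensional comparison might look like the following, with the dimension n chosen illustratively:

```wolfram
n = 20;
big1 = MultinormalDistribution[ConstantArray[0, n], IdentityMatrix[n]];
big2 = MultinormalDistribution[ConstantArray[1, n], 2 IdentityMatrix[n]];
RepeatedTiming[ResourceFunction["MultinormalKLDivergence"][big1, big2]]
```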

### Possible Issues (2)

If the two multinormal distributions supplied have different dimensions, a Failure object will result:

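For example, mixing a two-dimensional and a three-dimensional distribution (an illustrative call; per the text above, a Failure object results):

```wolfram
ResourceFunction["MultinormalKLDivergence"][
 MultinormalDistribution[{0, 0}, IdentityMatrix[2]],
 MultinormalDistribution[{0, 0, 0}, IdentityMatrix[3]]]
```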

If one or more of the distributions is not a multivariate normal distribution, a Failure object will result:

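For example, supplying an exponential distribution as the first argument (an illustrative call; per the text above, a Failure object results):

```wolfram
ResourceFunction["MultinormalKLDivergence"][
 ExponentialDistribution[1], NormalDistribution[0, 1]]
```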

### Neat Examples (5)

Use the Kullback–Leibler divergence to help regularize a variational autoencoder. Create an expression for the Kullback–Leibler divergence between a general multinormal distribution and one with zero mean and an identity covariance matrix:

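Such an expression might be obtained symbolically as follows, using an illustrative two-dimensional diagonal parameterization with variances v1 and v2:

```wolfram
kl = Simplify[
  ResourceFunction["MultinormalKLDivergence"][
   MultinormalDistribution[{m1, m2}, DiagonalMatrix[{v1, v2}]],
   MultinormalDistribution[{0, 0}, IdentityMatrix[2]]],
  Assumptions -> v1 > 0 && v2 > 0]
```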

Use the resulting expression to create a function layer that can be used for regularization:

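A sketch of such a layer, using the standard log-variance parameterization of the VAE regularization term; the port names "mean" and "logvar" are illustrative assumptions:

```wolfram
(* KL term against a standard normal prior, summed over latent
   components; named slots become the layer's input ports *)
klLayer = FunctionLayer[
  0.5 Total[#mean^2 + Exp[#logvar] - 1 - #logvar] &]
```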

Build a variational autoencoder architecture that uses the Kullback–Leibler divergence to prevent overfitting and to encourage the model to learn a compact and meaningful representation of the data:

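A minimal VAE-style architecture of the kind described might be sketched as follows; the layer sizes, port names and log-variance parameterization are all illustrative assumptions, not the example's actual network:

```wolfram
vae = NetGraph[
  <|"encode" -> NetChain[{LinearLayer[16], Tanh}],
    "mean" -> LinearLayer[2],
    "logvar" -> LinearLayer[2],
    "noise" -> RandomArrayLayer[NormalDistribution[], "Output" -> 2],
    "sample" -> FunctionLayer[#mean + Exp[0.5 #logvar] #noise &],
    "decode" -> NetChain[{LinearLayer[16], Tanh, LinearLayer[8]}],
    "reconstruct" -> MeanSquaredLossLayer[],
    "kl" -> FunctionLayer[
      0.5 Total[#mean^2 + Exp[#logvar] - 1 - #logvar] &],
    "loss" -> TotalLayer[]|>,
  {NetPort["Input"] -> "encode",
   "encode" -> "mean", "encode" -> "logvar",
   "mean" -> NetPort["sample", "mean"],
   "logvar" -> NetPort["sample", "logvar"],
   "noise" -> NetPort["sample", "noise"],
   "sample" -> "decode" -> NetPort["reconstruct", "Input"],
   NetPort["Input"] -> NetPort["reconstruct", "Target"],
   "mean" -> NetPort["kl", "mean"],
   "logvar" -> NetPort["kl", "logvar"],
   {"reconstruct", "kl"} -> "loss" -> NetPort["Loss"]},
  "Input" -> 8]
```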

Create structured data that one wants the variational autoencoder to capture:

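One illustrative way to build such structured data: noisy phase-shifted sinusoids of a fixed length, which an autoencoder can compress into a small latent space:

```wolfram
(* 200 noisy sinusoids of length 8 with random phases *)
data = Table[
   Sin[2 Pi Range[8]/8 + RandomReal[{0, 2 Pi}]] +
    RandomVariate[NormalDistribution[0, 0.1], 8], {200}];
ListLinePlot[Take[data, 5]]
```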

Use a RadialAxisPlot to show the smooth and compact reconstruction of the original vector from a latent space:

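Such a comparison might overlay an original vector and its reconstruction on radial axes; here the "reconstruction" is a smoothed stand-in, since in the actual example it would come from the trained autoencoder:

```wolfram
original = Sin[2 Pi Range[8]/8] +
   RandomVariate[NormalDistribution[0, 0.1], 8];
reconstruction = Sin[2 Pi Range[8]/8];  (* stand-in for the decoder output *)
RadialAxisPlot[{original, reconstruction}]
```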


## Version History

• 1.0.0 – 22 February 2023