Wolfram LaTeX Character-Level Language Model V1

Generate LaTeX code

This language model is based on a simple stack of gated recurrent layers. It was trained by Wolfram Research in 2018 using teacher forcing on character sequences of length 100.

Number of layers: 7 | Parameter count: 7,896,330 | Trained size: 32 MB
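Teacher forcing here means the net learns to predict character n+1 from the true characters 1 through n, rather than from its own earlier predictions. A minimal sketch of such a training graph in the Wolfram Language follows; it is an assumed reconstruction, not the actual training code, and langModel stands in for the untrained recurrent stack:

(* Assumed sketch of a teacher-forcing setup, not the actual training code *)
trainingNet = NetGraph[
  <|"most" -> SequenceMostLayer[], (* characters 1 .. n-1: the inputs *)
    "rest" -> SequenceRestLayer[], (* characters 2 .. n: the targets *)
    "model" -> langModel, (* placeholder for the untrained recurrent stack *)
    "loss" -> CrossEntropyLossLayer["Index"]|>,
  {NetPort["Input"] -> "most" -> "model" -> NetPort["loss", "Input"],
   NetPort["Input"] -> "rest" -> NetPort["loss", "Target"]}]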

Training Set Information

Examples

Resource retrieval

Get the pre-trained network:

In[1]:=
NetModel["Wolfram LaTeX Character-Level Language Model V1"]
Out[1]=
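The layer and parameter counts quoted above can be checked on the retrieved net. A quick sketch using NetInformation; the values should match the figures at the top of the page:

net = NetModel["Wolfram LaTeX Character-Level Language Model V1"];
NetInformation[net, "LayersCount"] (* 7 *)
NetInformation[net, "ArraysTotalElementCount"] (* 7896330 *)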

Basic usage

Predict the next character of a given sequence:

In[2]:=
NetModel["Wolfram LaTeX Character-Level Language Model V1"]["\\begin"]
Out[2]=
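The default output is the single most probable next character, so applying the net repeatedly gives a greedy completion. A minimal sketch that appends the top prediction ten times:

net = NetModel["Wolfram LaTeX Character-Level Language Model V1"];
Nest[# <> net[#] &, "\\begin", 10] (* feed the growing string back in *)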

Get the top 15 probabilities:

In[3]:=
topProbs = NetModel["Wolfram LaTeX Character-Level Language Model V1"][
  "\\begin", {"TopProbabilities", 15}]
Out[3]=

Plot the top 15 probabilities:

In[4]:=
BarChart[
 Thread@Labeled[Values@topProbs,
   Keys[topProbs] /. {"\n" -> "\\n", "\t" -> "\\t"}],
 ScalingFunctions -> "Log"]
Out[4]=
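Longer LaTeX can be generated by sampling a character from the predicted distribution and feeding it back into the net. The sketch below uses NetStateObject to preserve the recurrent state between evaluations and the "RandomSample" decoder property to draw from the distribution; generateSample is a hypothetical helper name, not part of the model:

(* Hypothetical helper: sample len characters following the seed string *)
generateSample[start_String, len_Integer] := Module[{state, res = start, c},
  state = NetStateObject[
    NetModel["Wolfram LaTeX Character-Level Language Model V1"]];
  c = state[start, "RandomSample"]; (* ingest the seed, sample one character *)
  res = res <> c;
  Do[
   c = state[c, "RandomSample"]; (* feed back the last sampled character *)
   res = res <> c,
   len - 1];
  res]

generateSample["\\begin", 100]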

Requirements

Wolfram Language 11.3 (March 2018) or above

Resource History

Reference

  • Wolfram Research