Function Repository Resource:

RuleNetGraph

Source Notebook

Create a NetGraph with a simplified rule-based interface

Contributed by: Alec Graves

ResourceFunction["RuleNetGraph"][layer1 -> layer2 -> … -> layern]

creates a NetGraph object from the specified layers.

ResourceFunction["RuleNetGraph"][layer1 -> layer2 -> … -> layern, arg1 -> val1]

builds a NetGraph using the specified layers, with arg1 -> val1 passed as an additional option to NetGraph.

Details

The first argument of ResourceFunction["RuleNetGraph"] should be a collection of rules relating neural network layers and NetPort objects.
Named layers are constructed using Subscript[layer, "StringName"]. Use Ctrl+- as a hot-key to enter a subscript in a notebook.
Named layers are referenced using strings.
Specifying input array shapes, a NetEncoder, or a NetDecoder for a NetPort can be done with options of the form "port" -> shape. The shape specification is the same as for NetGraph.
ResourceFunction["RuleNetGraph"] supports all options of NetGraph (such as LearningRateMultipliers).
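
Taken together, the conventions above can be sketched in one small input (the name "hidden" and the sizes here are illustrative, not taken from the examples below):

```wl
(* a named layer, a string reference to it, an input shape option,
   and a NetGraph option, all in one call *)
ResourceFunction["RuleNetGraph"][{
   NetPort["Input"] -> Subscript[LinearLayer[16], "hidden"],
   "hidden" -> SoftmaxLayer[]},
  "Input" -> 8,
  LearningRateMultipliers -> {"hidden" -> 0}]
```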

Examples

Basic Examples (2) 

Build a simple NetGraph using a list of rules:

In[1]:=
ResourceFunction["RuleNetGraph"][LogisticSigmoid -> (#*17 + 8 &)]

Use lists of NetPort objects and layers for more complicated networks:

In[2]:=
ResourceFunction["RuleNetGraph"][{NetPort["Input1"], NetPort["Input2"]}
  -> {Plus, Times} -> CatenateLayer[] -> LinearLayer[3] -> SoftmaxLayer[]]

Scope (3) 

Use string subscripts to manually specify layer names:

In[3]:=
ResourceFunction["RuleNetGraph"][{
   Subscript[LinearLayer[32], "lin32"] ->
    Subscript[LinearLayer[64], "lin64"],
   "lin32"
   } -> Subscript[CatenateLayer[], "cat"] -> Ramp ->
  Subscript[LinearLayer[3], "logits"] -> Subscript[SoftmaxLayer[], "pred"]]
Out[3]=
In[4]:=
ResourceFunction["RuleNetGraph"][NetPort["Input"] ->
  Subscript[ElementwiseLayer["Sigmoid"], "MySigmoid"]]
Out[4]=

The above is equivalent to:

In[5]:=
ResourceFunction["RuleNetGraph"][NetPort["Input"] ->
  Subscript[ElementwiseLayer["Sigmoid"], "MySigmoid"]]

And is also equivalent to:

In[6]:=
NetGraph[<|"MySigmoid" -> ElementwiseLayer["Sigmoid"]|>,
 {NetPort["Input"] -> "MySigmoid"}]

A list of rules can be used to specify network connections in any order:

In[7]:=
ResourceFunction["RuleNetGraph"][{
  {"act1", "act2"} -> Times,
  NetPort["Input1"] -> Subscript[ElementwiseLayer["Sigmoid"], "act1"],
  NetPort["Input1"] -> Subscript[SoftmaxLayer[], "act2"]
  }]
Out[7]=

Networks can be specified with multiple inputs and outputs:

In[8]:=
ResourceFunction["RuleNetGraph"][{
  {NetPort["Sin"] -> Subscript[(#^2 &), "s^2"],
    NetPort["Cos"] -> Subscript[(#^2 &), "c^2"]} -> Plus -> NetPort["Sum"],
  {NetPort["Cos"], NetPort["Sin"]} ->
   ThreadingLayer[(ArcTan[#cos, #sin] &)] -> NetPort["Angle"]
  }]
Out[8]=

Options (1) 

Additional arguments are passed through RuleNetGraph to NetGraph (for example, input port size):

In[9]:=
ResourceFunction["RuleNetGraph"][
 NetPort["Input"] -> ConvolutionLayer[32, 3, "Stride" -> 2, "PaddingSize" -> 1]
   -> BatchNormalizationLayer[]
   -> Ramp, "Input" -> {3, 32, 32}]

Applications (1) 

RuleNetGraph can easily describe networks with complicated data flow:

In[10]:=
denseBnRelu[n_Integer] := NetChain[{
   LinearLayer@n, BatchNormalizationLayer[], Ramp}]
ResourceFunction["RuleNetGraph"][{{
    {
     Subscript[denseBnRelu[64], "dense1"],
     Subscript[denseBnRelu[64], "dense2"],
     denseBnRelu[128] -> denseBnRelu[64],
     denseBnRelu[128] -> denseBnRelu[64],
     NetPort["MidAdjust"]
     } -> Subscript[Plus, "FirstPlus"] -> {denseBnRelu[64], denseBnRelu[64]},
    "dense1",
    "dense2",
    NetPort["OutAdjust"],
    "FirstPlus" -> Subscript[((#^2)/2 &), "x^2/2"]} -> Plus ->
   Subscript[LinearLayer[12], "logits"] ->
   Subscript[SoftmaxLayer[], "pred"]}]
Out[11]=

Possible Issues (8) 

Multiple layers cannot be declared with the same manually-assigned name:

In[12]:=
ResourceFunction["RuleNetGraph"][NetPort["Input"] ->
  Subscript[LogisticSigmoid, "MyActivation"] ->
  Subscript[(#*17 + 8 &), "MyActivation"]]
Out[12]=

You cannot reuse layers in a way that causes the output of a layer to depend on itself:

In[13]:=
ResourceFunction["RuleNetGraph"][NetPort["Input"] ->
  Subscript[LogisticSigmoid, "MyActivation"] -> "MyActivation"]
Out[13]=

Manual layer names must be specified with a string subscript; a non-string subscript produces a message:

In[14]:=
ResourceFunction["RuleNetGraph"][
 NetPort["Input"] -> Subscript[LogisticSigmoid, MyActivation]]
Out[14]=
In[15]:=
ResourceFunction["RuleNetGraph"][NetPort["Input"] ->
  Subscript[LogisticSigmoid, "MyActivation"]]
Out[15]=

All layers must be properly constructed; a bare layer symbol such as SoftmaxLayer is not a valid layer:

In[16]:=
ResourceFunction["RuleNetGraph"][NetPort["Input"] -> SoftmaxLayer]
Out[16]=

Calling the layer with zero arguments works:

In[17]:=
ResourceFunction["RuleNetGraph"][NetPort["Input"] -> SoftmaxLayer[]]
Out[17]=

The first argument of RuleNetGraph must contain at least one Rule (->):

In[18]:=
ResourceFunction["RuleNetGraph"][SoftmaxLayer[]]
Out[18]=
In[19]:=
ResourceFunction["RuleNetGraph"][NetPort["Input"] -> SoftmaxLayer[]]
Out[19]=

You cannot use string subscripts on NetPort or any other parameter that is not a manually named layer:

In[20]:=
ResourceFunction["RuleNetGraph"][
 Subscript[NetPort["Input1"], "1"] -> SoftmaxLayer[]]
Out[20]=

You will get errors if you try to use a function that cannot be automatically converted to a neural network layer:

In[21]:=
ResourceFunction["RuleNetGraph"][
 NetPort["Input1"] -> (Transpose[#, {3, 2, 1}] &)]
Out[21]=
In[22]:=
ResourceFunction["RuleNetGraph"][
 NetPort["Input1"] -> TransposeLayer[{3, 2, 1}]]
Out[22]=

Be careful to group pure functions properly. Because Function (&) binds more loosely than Rule (->), the following parses as one large pure function rather than a rule, and gives an error:

In[23]:=
ResourceFunction["RuleNetGraph"][NetPort["Input1"] -> #*4 &]
Out[23]=

Wrapping the pure function in parentheses fixes it:

In[24]:=
ResourceFunction["RuleNetGraph"][NetPort["Input1"] -> (#*4 &)]
Out[24]=

Publisher

Alec Graves

Version History

  • 1.0.0 – 10 January 2023

Related Resources

Author Notes

This implementation contains a hard-coded list of the ThreadingLayer and ElementwiseLayer functions that can be automatically converted to neural network layers, so RuleNetGraph will fail to automatically convert functions that are not in this list. As WRI extends these layers in future releases, RuleNetGraph may not be able to automatically convert everything that can theoretically be converted to net layers in the latest version. In that situation, users may need to wrap newly supported functions explicitly, as in ElementwiseLayer[WRINewFancyFunction] or ThreadingLayer[WRINewFancyFunction].
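
As a sketch of that workaround (WRINewFancyFunction is a hypothetical placeholder, as above):

```wl
(* hypothetical: a function that ElementwiseLayer supports in a newer
   release but RuleNetGraph's hard-coded list does not yet include;
   wrapping it explicitly bypasses the automatic conversion *)
ResourceFunction["RuleNetGraph"][
 NetPort["Input"] -> ElementwiseLayer[WRINewFancyFunction]]
```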

License Information