Function Repository Resource:

NetParallelOperator

Perform multiple operations on an input in a neural net

Contributed by: Sjoerd Smit

ResourceFunction["NetParallelOperator"][{net1,net2,}]

represents a net with a single input and multiple outputs, where the i-th output corresponds to applying neti to the input.

ResourceFunction["NetParallelOperator"][<|out1 net1,out2net2,|>]

specifies that the output of neti should be linked to output port outi.

ResourceFunction["NetParallelOperator"][spec, cat]

uses the layer or net cat to combine the outputs back into a single array.

ResourceFunction["NetParallelOperator"][spec,Automatic]

catenates the outputs sequentially using CatenateLayer[0].
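
For intuition, the constructed net is essentially a NetGraph in which the single input port is fanned out to every net and each net is given its own output port. The following is a minimal sketch of that structure for two elementwise nets; the port names "Sin" and "Cos" are chosen for illustration only, and the resource function may build its graph differently:

(* sketch of the fan-out structure; not necessarily the resource function's exact implementation *)
NetGraph[
 <|"sin" -> ElementwiseLayer[Sin], "cos" -> ElementwiseLayer[Cos]|>,
 {NetPort["Input"] -> "sin", NetPort["Input"] -> "cos", (* the single input feeds both nets *)
  "sin" -> NetPort["Sin"], "cos" -> NetPort["Cos"]} (* each net gets its own output port *)
]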

Examples

Basic Examples (2) 

Define a net that computes the Sin, Cos and Tan of an input:

In[1]:=
net = ResourceFunction["NetParallelOperator"][
  {ElementwiseLayer[Sin], ElementwiseLayer[Cos], ElementwiseLayer[Tan]}]
Out[1]=

Apply it to one or more values:

In[2]:=
net[1]
Out[2]=
In[3]:=
net[N@Range[0, 2 \[Pi], \[Pi]/4]]
Out[3]=

Specify custom names for the output ports:

In[4]:=
net = ResourceFunction["NetParallelOperator"][
  <|"sin" -> ElementwiseLayer[Sin], "cos" -> ElementwiseLayer[Cos], "tan" -> ElementwiseLayer[Tan]|>]
Out[4]=
In[5]:=
net[1]
Out[5]=

Scope (2) 

Convert the output to a single array again by joining the results:

In[6]:=
net = ResourceFunction["NetParallelOperator"][
  {ElementwiseLayer[Sin], ElementwiseLayer[Cos], ElementwiseLayer[Tan]},
  Automatic
  ]
Out[6]=
In[7]:=
net[1]
Out[7]=
In[8]:=
net[N@Range[0, 2 \[Pi], \[Pi]/4]]
Out[8]=

Use a different operation for combining the results into tuples:

In[9]:=
net = ResourceFunction["NetParallelOperator"][
  {ElementwiseLayer[Sin], ElementwiseLayer[Cos], ElementwiseLayer[Tan]},
  NetChain[{CatenateLayer[0], TransposeLayer[]}]
  ]
Out[9]=
In[10]:=
net[N@Range[0, 2 \[Pi], \[Pi]/4]]
Out[10]=

Applications (5) 

Create a network that finds a matrix with specified row and column sums while keeping the elements as small as possible. NetParallelOperator can be used to calculate the required sums:

In[11]:=
net = ResourceFunction["NetParallelOperator"][
  <|
   "RowSums" -> AggregationLayer[Total, 2],
   "ColumnSums" -> AggregationLayer[Total, 1],
   "SumOfSquares" -> NetChain[{
      ElementwiseLayer[0.01*#^2 &], (* multiply by a small scaling factor so that the row and column sum losses take priority during training *)
      AggregationLayer[Total, All]
      }] |>
  ]
Out[11]=

Define a training net with a learnable matrix:

In[12]:=
trainingNet = NetGraph[
  <|
   "mat" -> NetArrayLayer[],
   "sums" -> net,
   "mse1" -> MeanSquaredLossLayer[],
   "mse2" -> MeanSquaredLossLayer[]
   |>,
  {
   "mat" -> "sums",
   {NetPort["RowSums"], NetPort["sums", "RowSums"]} -> "mse1" -> NetPort["RowLoss"],
   {NetPort["ColumnSums"], NetPort["sums", "ColumnSums"]} -> "mse2" -> NetPort["ColumnLoss"],
   NetPort["sums", "SumOfSquares"] -> NetPort["SumOfSquaresLoss"]
   }
  ]
Out[12]=

Train the net to find a matrix with the given row and column sums:

In[13]:=
input = <|"RowSums" -> {{5, 3, 2}}, "ColumnSums" -> { {1, 2, 3, 4}}|>;
trainedNet = NetTrain[trainingNet,
  input,
  LossFunction -> {"RowLoss", "ColumnLoss", "SumOfSquaresLoss"},
  TimeGoal -> 10
  ]
Out[14]=

Extract the matrix found by the model:

In[15]:=
Normal@NetExtract[trainedNet, {"mat", "Array"}]
Out[15]=
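
As a quick sanity check (a sketch; the variable name mat is introduced here and is not part of the example above), the row and column totals of the learned matrix should be close to the target values {5, 3, 2} and {1, 2, 3, 4}:

mat = Normal@NetExtract[trainedNet, {"mat", "Array"}];
Total[mat, {2}] (* row sums; should be approximately {5, 3, 2} *)
Total[mat, {1}] (* column sums; should be approximately {1, 2, 3, 4} *)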

Check the losses of the row and column deviations:

In[16]:=
trainedNet[input]
Out[16]=

Publisher

Sjoerd Smit

Version History

  • 1.0.0 – 11 October 2022
