Pose-Aware Face Recognition in the Wild Nets Trained on CASIA WebFace Data

Represent a facial image as a vector

Released in 2016, these models tackle the problem of pose and viewpoint variation in face recognition systems. Unlike models that attempt to transform faces in different poses and viewpoints to a canonical frontal pose, this set of models provides multiple pose-specific nets.

Number of models: 10
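
Each net in the family takes a facial image and returns a feature vector characterizing that face. A minimal sketch of this, assuming img holds a cropped facial image and using the "VGG"/40-degree parameter combination demonstrated below:

(* a minimal sketch: img is assumed to hold a cropped facial image *)
net = NetModel[{"Pose-Aware Face Recognition in the Wild Nets Trained on \
CASIA WebFace Data", "Architecture" -> "VGG", "Pose" -> Quantity[40, "Degrees"]}];
featureVector = net[img]

Two such vectors can then be compared, for example with Correlation as in the evaluation function below, to decide whether they depict the same person.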

Training Set Information

Performance

Examples

Resource retrieval

Get the pre-trained net:

In[1]:=
NetModel["Pose-Aware Face Recognition in the Wild Nets Trained on \
CASIA WebFace Data"]
Out[1]=

NetModel parameters

This model consists of a family of individual nets, each identified by a specific parameter combination. Inspect the available parameters:

In[2]:=
NetModel["Pose-Aware Face Recognition in the Wild Nets Trained on \
CASIA WebFace Data", "ParametersInformation"]
Out[2]=

Pick a non-default net by specifying the parameters:

In[3]:=
NetModel[{"Pose-Aware Face Recognition in the Wild Nets Trained on \
CASIA WebFace Data", "Architecture" -> "VGG", "Pose" -> Quantity[40, "Degrees"]}]
Out[3]=

Pick a non-default uninitialized net:

In[4]:=
NetModel[{"Pose-Aware Face Recognition in the Wild Nets Trained on \
CASIA WebFace Data", "Architecture" -> "VGG", "Pose" -> Quantity[40, "Degrees"]}, "UninitializedEvaluationNet"]
Out[4]=

Evaluation function

Create an evaluation function that takes two facial images and returns True if they belong to the same person and False otherwise:

In[5]:=
netevaluate[img1_, img2_, pose_, threshold_ : 0.1, architecture_ : "VGG"] := Block[
  {net, features},
  (* fetch the pose-specific net for the given pose and architecture *)
  net = NetModel@{"Pose-Aware Face Recognition in the Wild Nets \
Trained on CASIA WebFace Data", "Pose" -> pose, "Architecture" -> architecture};
  (* compute one feature vector per input image *)
  features = net@{img1, img2};
  (* declare a match if the correlation of the two feature vectors exceeds the threshold *)
  Correlation @@ features >= threshold
  ]

Basic usage

Use the evaluation function to predict whether two facial images belong to the same person:

In[6]:=
(* Evaluate this cell to get the example input *) CloudGet["https://www.wolframcloud.com/obj/d34c296c-5009-48d4-835b-2704bd767d51"]
Out[6]=
In[7]:=
(* Evaluate this cell to get the example input *) CloudGet["https://www.wolframcloud.com/obj/265a43b1-321d-4dc0-82cb-925be03dbb96"]
Out[7]=
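
A sketch of the comparison step, assuming the two example inputs above were assigned to the hypothetical variables imgA and imgB and using the 40-degree pose shown earlier:

(* imgA and imgB are hypothetical names for the example images retrieved above *)
netevaluate[imgA, imgB, Quantity[40, "Degrees"]]

The call returns True when the correlation of the two feature vectors meets the default threshold of 0.1, and False otherwise.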

Net information

Inspect the number of parameters of all arrays in the net:

In[8]:=
Information[
 NetModel["Pose-Aware Face Recognition in the Wild Nets Trained on \
CASIA WebFace Data"], "ArraysElementCounts"]
Out[8]=

Obtain the total number of parameters:

In[9]:=
Information[
 NetModel["Pose-Aware Face Recognition in the Wild Nets Trained on \
CASIA WebFace Data"], "ArraysTotalElementCount"]
Out[9]=

Obtain the layer type counts:

In[10]:=
Information[
 NetModel["Pose-Aware Face Recognition in the Wild Nets Trained on \
CASIA WebFace Data"], "LayerTypeCounts"]
Out[10]=

Export to MXNet

Export the net into a format that can be opened in MXNet:

In[11]:=
jsonPath = Export[FileNameJoin[{$TemporaryDirectory, "net.json"}], NetModel["Pose-Aware Face Recognition in the Wild Nets Trained on \
CASIA WebFace Data"], "MXNet"]
Out[11]=

Export also creates a net.params file containing parameters:

In[12]:=
paramPath = FileNameJoin[{DirectoryName[jsonPath], "net.params"}]
Out[12]=

Get the size of the parameter file:

In[13]:=
FileByteCount[paramPath]
Out[13]=
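
As a quick check, the exported topology can be read back with Import; a minimal sketch, assuming the "MXNet" import format is available in your Wolfram Language installation:

(* re-import the exported net topology from the JSON file written above *)
Import[jsonPath, "MXNet"]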

Requirements

Wolfram Language 12.0 (April 2019) or above

Resource History

Reference