Layer Statistics

Benanza provides an easy way to get statistics about models. Benanza reads the model(s) and performs aggregations to count the number of layers, along with other useful information. The model_path can be a folder, in which case the tool finds all models within the folder and computes the statistics for each model. The outputs are placed in output_path based on each model's name. The flags accepted by the layerstats command are shown in its help output:

benanza layerstats -h
Usage:
  benanza layerstats [flags]

Aliases:
  layerstats, stats

Flags:
  -h, --help   help for layerstats

Global Flags:
  -b, --batch_size int       batch size (default 1)
  -f, --format string        print format to use (default "automatic")
      --full                 print all information about the layers
      --human                print flops in human form
  -d, --model_dir string     model directory
  -p, --model_path string    path to the model prototxt file
      --no_header            do not show header labels for output
  -o, --output_file string   output file name
For example, we can get the layer statistics for all models using

benanza layerstats --model_path //models/path -f csv

This outputs a set of CSV files, which can be used to create a plot of the layer statistics.
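As a sketch of how one of the generated CSVs might be turned into such a plot (the file path and the Type column name are assumptions for illustration; check the header of the CSV your version of the tool produces):

import pandas as pd
import matplotlib.pyplot as plt

# Load one of the generated CSV files; the path and column name
# below are illustrative assumptions.
df = pd.read_csv("assets/layer_stats/AlexNet.csv")

# Count the layers of each type and plot the counts.
counts = df["Type"].value_counts()
counts.plot(kind="bar")
plt.xlabel("Layer type")
plt.ylabel("Number of layers")
plt.tight_layout()
plt.savefig("alexnet_layer_stats.png")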

You can also output the data in JSON, TeX, or Markdown table format. Here we output the layer statistics for AlexNet in JSON format:

benanza layerstats --model_path //AlexNet --output_path assets/layer_stats --format json

The JSON output is written under assets/layer_stats, named after the model.
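As a minimal sketch of consuming the JSON output (the schema assumed here, a list of per-layer records with a type field, is an illustration; inspect the generated file for the actual field names):

import json
from collections import Counter

with open("assets/layer_stats/AlexNet.json") as f:
    layers = json.load(f)

# Tally the layer types, assuming each record carries a "type" field.
print(Counter(layer["type"] for layer in layers))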

Layer Statistics by Year

You can also get the layer statistics aggregated by year:

benanza year_info

YEAR  TOTAL  UNIQUE
2010     12      11
2012    561     163
2013     47      34
2014    349     129
2015   1577     190
2016   2286     459
2017    510     127
2018    412      45

This data is used to create a plot of layer counts by year.
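Such a plot can be reproduced from the table above; a minimal matplotlib sketch using the values shown:

import matplotlib.pyplot as plt

years = [2010, 2012, 2013, 2014, 2015, 2016, 2017, 2018]
total = [12, 561, 47, 349, 1577, 2286, 510, 412]
unique = [11, 163, 34, 129, 190, 459, 127, 45]

plt.plot(years, total, marker="o", label="total layers")
plt.plot(years, unique, marker="s", label="unique layers")
plt.xlabel("Year")
plt.ylabel("Number of layers")
plt.legend()
plt.tight_layout()
plt.savefig("layers_by_year.png")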

Model Plot

The layerstats command can also plot the model. You can generate a plot of AlexNet using:

benanza layerstats --model_path //BVLC_AlexNet --format dot -o /tmp/alexnet.dot

The resulting file can be processed using dot to produce the graph below.
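For example, with Graphviz installed, the dot file can be rendered to an image (png is one choice; dot also supports svg, pdf, and other formats):

dot -Tpng /tmp/alexnet.dot -o /tmp/alexnet.png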



[Figure: Graphviz rendering of the AlexNet graph. The data_0 input feeds conv/relu/lrn/maxpool stages for conv1 and conv2, conv/relu stages for conv3 and conv4, a conv/relu/maxpool stage for conv5, a reshape (OC2_DUMMY_0), gemm/relu/dropout stages for fc6 and fc7, a gemm for fc8, and a final softmax (prob_1). Weight and bias tensors (conv*_w_0, conv*_b_0, fc*_w_0, fc*_b_0) feed their corresponding layers.]

The output depends on the batch size, so we can modify the batch size and examine the layer shapes. Here we draw the graph using 256 as the batch size:

benanza layerstats --model_path //BVLC_AlexNet --format dot -o /tmp/alexnet.dot --batch_size=256

The graph topology is the same, but the input/output shapes differ.
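One way to see this is to keep the batch-1 graph from earlier and write the batch-256 graph to a separately named file (the _256 suffix is just an illustrative choice), then diff the two; only the shape annotations should change:

benanza layerstats --model_path //BVLC_AlexNet --format dot -o /tmp/alexnet_256.dot --batch_size=256
diff /tmp/alexnet.dot /tmp/alexnet_256.dot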



[Figure: the same AlexNet graph rendered at batch size 256; the topology is identical to the previous figure, but the input/output shape annotations differ.]

How to generate a plot that also displays the layer shapes is described in the layer shape information section.