【onnx】A simple Python parser for printing ONNX model information
Preface:
For what ONNX is and a general introduction, see: ONNX学习笔记 (ONNX study notes).
When we load an ONNX model, what we get is a ModelProto. It contains some version information, producer information, and a GraphProto. The GraphProto in turn contains four repeated arrays: node (NodeProto), input (ValueInfoProto), output (ValueInfoProto), and initializer (TensorProto). node holds all of the model's compute nodes, input holds the model's input nodes, output holds all of the model's output nodes, and initializer holds all of the model's weight parameters.
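As a minimal sketch of that hierarchy (the file name here is just a placeholder; the field names come from the ONNX protobuf definitions):

```python
import onnx

model = onnx.load('model.onnx')      # ModelProto
print(model.ir_version, model.producer_name, model.producer_version)

graph = model.graph                  # GraphProto
print(len(graph.node))               # repeated NodeProto: all compute nodes
print(len(graph.input))              # repeated ValueInfoProto: model inputs
print(len(graph.output))             # repeated ValueInfoProto: model outputs
print(len(graph.initializer))        # repeated TensorProto: weight parameters
```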
Code
Here we use Python to print out the useful information contained in an ONNX model, to get an intuitive view of it.
Overall flow:
- parse the graph inputs
- parse the graph outputs
- parse the graph initializers, i.e. all of the model's weight parameters
- parse the graph nodes, printing each op's type and its inputs/outputs, which recovers the whole compute graph
```python
import onnx
import numpy as np
import logging

logging.basicConfig(level=logging.INFO)


def onnx_datatype_to_npType(data_type):
    # TensorProto.DataType: 1 == FLOAT (float32); extend here for other dtypes
    if data_type == 1:
        return np.float32
    else:
        raise TypeError(f"unsupported data type: {data_type}")


def parser_initializer(initializer):
    name = initializer.name
    logging.info(f"initializer name: {name}")

    dims = initializer.dims
    shape = [x for x in dims]
    logging.info(f"initializer with shape:{shape}")

    dtype = initializer.data_type
    logging.info(f"initializer with type: {onnx_datatype_to_npType(dtype)} ")

    # print the first 10 values of the raw weight buffer
    weights = np.frombuffer(initializer.raw_data, dtype=onnx_datatype_to_npType(dtype))
    logging.info(f"initializer first 10 weights:{weights[:10]}")


def parser_tensor(tensor, use='normal'):
    name = tensor.name
    logging.info(f"{use} tensor name: {name}")

    data_type = tensor.type.tensor_type.elem_type
    logging.info(f"{use} tensor data type: {data_type}")

    dims = tensor.type.tensor_type.shape.dim
    shape = []
    for i, dim in enumerate(dims):
        shape.append(dim.dim_value)
    logging.info(f"{use} tensor with shape:{shape} ")


def parser_node(node):
    def attri_value(attri):
        # AttributeProto.AttributeType: 2 == INT, 7 == INTS
        if attri.type == 2:
            return attri.i
        elif attri.type == 7:
            return list(attri.ints)
        return None

    name = node.name
    logging.info(f"node name:{name}")

    opType = node.op_type
    logging.info(f"node op type:{opType}")

    inputs = list(node.input)
    logging.info(f"node with {len(inputs)} inputs:{inputs}")

    outputs = list(node.output)
    logging.info(f"node with {len(outputs)} outputs:{outputs}")

    attributes = node.attribute
    for attri in attributes:
        name = attri.name
        value = attri_value(attri)
        logging.info(f"{name} with value:{value}")


def parser_info(onnx_model):
    ir_version = onnx_model.ir_version
    producer_name = onnx_model.producer_name
    producer_version = onnx_model.producer_version
    for info in [ir_version, producer_name, producer_version]:
        logging.info("onnx model with info:{}".format(info))


def parser_inputs(onnx_graph):
    inputs = onnx_graph.input
    for input in inputs:
        parser_tensor(input, 'input')


def parser_outputs(onnx_graph):
    outputs = onnx_graph.output
    for output in outputs:
        parser_tensor(output, 'output')


def parser_graph_initializers(onnx_graph):
    initializers = onnx_graph.initializer
    for initializer in initializers:
        parser_initializer(initializer)


def parser_graph_nodes(onnx_graph):
    nodes = onnx_graph.node
    for node in nodes:
        parser_node(node)


def onnx_parser():
    model_path = './resblock.onnx'
    model = onnx.load(model_path)

    # 0. model-level info
    parser_info(model)

    graph = model.graph

    # 1. graph inputs
    parser_inputs(graph)

    # 2. graph outputs
    parser_outputs(graph)

    # 3. initializers (weights)
    parser_graph_initializers(graph)

    # 4. compute nodes
    parser_graph_nodes(graph)


if __name__ == '__main__':
    onnx_parser()
```
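As a quick cross-check of the hand-rolled parser above (not part of the original script), onnx's own helpers can validate the model and dump a readable version of the graph; a minimal sketch:

```python
import onnx

model = onnx.load('./resblock.onnx')
onnx.checker.check_model(model)                  # structural validation
print(onnx.helper.printable_graph(model.graph))  # human-readable graph dump
```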
Output
INFO:root:onnx model with info:6
INFO:root:onnx model with info:pytorch
INFO:root:onnx model with info:1.7
INFO:root:input tensor name: input.1
INFO:root:input tensor data type: 1
INFO:root:input tensor with shape:[1, 3, 480, 640]
INFO:root:output tensor name: 22
INFO:root:output tensor data type: 1
INFO:root:output tensor with shape:[1, 64, 480, 640]
INFO:root:initializer name: conv1.bias
INFO:root:initializer with shape:[64]
INFO:root:initializer with type: <class 'numpy.float32'>
INFO:root:initializer first 10 weights:[ 0.07732448 0.17457943 0.17851995 -0.05609495 0.14483029 -0.04734851
0.10502735 -0.18362212 0.12153099 0.16765712]
INFO:root:initializer name: conv1.weight
INFO:root:initializer with shape:[64, 3, 3, 3]
INFO:root:initializer with type: <class 'numpy.float32'>
INFO:root:initializer first 10 weights:[ 0.02473598 -0.18695308 -0.12158576 -0.06351418 -0.13614939 -0.17964806
-0.0885106 0.11713844 -0.14802186 0.04773027]
INFO:root:initializer name: conv2.bias
INFO:root:initializer with shape:[64]
INFO:root:initializer with type: <class 'numpy.float32'>
INFO:root:initializer first 10 weights:[ 0.02602181 0.00118999 0.00190271 -0.00422467 -0.01745024 0.04086005
0.01714195 0.00828495 0.03176006 0.00428755]
INFO:root:initializer name: conv2.weight
INFO:root:initializer with shape:[64, 64, 3, 3]
INFO:root:initializer with type: <class 'numpy.float32'>
INFO:root:initializer first 10 weights:[-0.00695022 0.03211054 0.03908887 0.0133714 -0.000227 -0.00840447
0.03178968 -0.02974798 0.01581161 -0.01668321]
INFO:root:initializer name: conv3.bias
INFO:root:initializer with shape:[64]
INFO:root:initializer with type: <class 'numpy.float32'>
INFO:root:initializer first 10 weights:[-0.18881151 -0.32025716 -0.0840579 0.23001562 0.3900114 -0.4994867
0.45777866 0.42241076 0.40811017 0.32145265]
INFO:root:initializer name: conv3.weight
INFO:root:initializer with shape:[64, 3, 1, 1]
INFO:root:initializer with type: <class 'numpy.float32'>
INFO:root:initializer first 10 weights:[ 0.51609325 0.05624352 0.5465418 0.3039956 -0.11096366 -0.3982828
-0.4997819 0.19012491 -0.05253417 -0.09907632]
INFO:root:initializer name: conv4.bias
INFO:root:initializer with shape:[64]
INFO:root:initializer with type: <class 'numpy.float32'>
INFO:root:initializer first 10 weights:[ 0.02621709 0.00735469 0.01134209 -0.02597461 0.03276851 0.00755459
0.02107707 -0.00700061 0.01922615 0.03978326]
INFO:root:initializer name: conv4.weight
INFO:root:initializer with shape:[64, 64, 3, 3]
INFO:root:initializer with type: <class 'numpy.float32'>
INFO:root:initializer first 10 weights:[ 0.01240561 0.00703304 0.00477142 0.00702672 0.00661598 0.01905426
-0.00150001 0.01970465 -0.03535433 0.0147273 ]
INFO:root:initializer name: conv5.bias
INFO:root:initializer with shape:[64]
INFO:root:initializer with type: <class 'numpy.float32'>
INFO:root:initializer first 10 weights:[-0.03412882 -0.01711171 0.00250292 -0.02079247 -0.01510924 0.03716683
-0.02143056 -0.03259912 -0.00563147 0.01869656]
INFO:root:initializer name: conv5.weight
INFO:root:initializer with shape:[64, 64, 3, 3]
INFO:root:initializer with type: <class 'numpy.float32'>
INFO:root:initializer first 10 weights:[ 0.0076859 0.0182974 0.02970468 -0.03959073 0.0202815 0.01744279
0.02319324 -0.01895014 0.01619028 0.01473094]
INFO:root:node name:Conv_0
INFO:root:node op type:Conv
INFO:root:node with 3 inputs:['input.1', 'conv1.weight', 'conv1.bias']
INFO:root:node with 1 outputs:['11']
INFO:root:dilations with value:[1, 1]
INFO:root:group with value:1
INFO:root:kernel_shape with value:[3, 3]
INFO:root:pads with value:[1, 1, 1, 1]
INFO:root:strides with value:[1, 1]
INFO:root:node name:Relu_1
INFO:root:node op type:Relu
INFO:root:node with 1 inputs:['11']
INFO:root:node with 1 outputs:['12']
INFO:root:node name:Conv_2
INFO:root:node op type:Conv
INFO:root:node with 3 inputs:['12', 'conv2.weight', 'conv2.bias']
INFO:root:node with 1 outputs:['13']
INFO:root:dilations with value:[1, 1]
INFO:root:group with value:1
INFO:root:kernel_shape with value:[3, 3]
INFO:root:pads with value:[1, 1, 1, 1]
INFO:root:strides with value:[1, 1]
INFO:root:node name:Relu_3
INFO:root:node op type:Relu
INFO:root:node with 1 inputs:['13']
INFO:root:node with 1 outputs:['14']
INFO:root:node name:Conv_4
INFO:root:node op type:Conv
INFO:root:node with 3 inputs:['input.1', 'conv3.weight', 'conv3.bias']
INFO:root:node with 1 outputs:['15']
INFO:root:dilations with value:[1, 1]
INFO:root:group with value:1
INFO:root:kernel_shape with value:[1, 1]
INFO:root:pads with value:[0, 0, 0, 0]
INFO:root:strides with value:[1, 1]
INFO:root:node name:Relu_5
INFO:root:node op type:Relu
INFO:root:node with 1 inputs:['15']
INFO:root:node with 1 outputs:['16']
INFO:root:node name:Add_6
INFO:root:node op type:Add
INFO:root:node with 2 inputs:['14', '16']
INFO:root:node with 1 outputs:['17']
INFO:root:node name:Conv_7
INFO:root:node op type:Conv
INFO:root:node with 3 inputs:['17', 'conv4.weight', 'conv4.bias']
INFO:root:node with 1 outputs:['18']
INFO:root:dilations with value:[1, 1]
INFO:root:group with value:1
INFO:root:kernel_shape with value:[3, 3]
INFO:root:pads with value:[1, 1, 1, 1]
INFO:root:strides with value:[1, 1]
INFO:root:node name:Relu_8
INFO:root:node op type:Relu
INFO:root:node with 1 inputs:['18']
INFO:root:node with 1 outputs:['19']
INFO:root:node name:Conv_9
INFO:root:node op type:Conv
INFO:root:node with 3 inputs:['19', 'conv5.weight', 'conv5.bias']
INFO:root:node with 1 outputs:['20']
INFO:root:dilations with value:[1, 1]
INFO:root:group with value:1
INFO:root:kernel_shape with value:[3, 3]
INFO:root:pads with value:[1, 1, 1, 1]
INFO:root:strides with value:[1, 1]
INFO:root:node name:Relu_10
INFO:root:node op type:Relu
INFO:root:node with 1 inputs:['20']
INFO:root:node with 1 outputs:['21']
INFO:root:node name:Add_11
INFO:root:node op type:Add
INFO:root:node with 2 inputs:['21', '17']
INFO:root:node with 1 outputs:['22']
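For reference, the printed nodes and initializer shapes are consistent with a small residual block like the one sketched below. This is a hedged reconstruction, not the author's original training code: the layer names conv1 through conv5 come from the initializer names, the kernel sizes and paddings from the Conv attributes, and the connectivity from the node inputs/outputs.

```python
import torch
import torch.nn as nn

class ResBlock(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 64, 3, padding=1)    # weight shape [64, 3, 3, 3]
        self.conv2 = nn.Conv2d(64, 64, 3, padding=1)   # [64, 64, 3, 3]
        self.conv3 = nn.Conv2d(3, 64, 1)               # [64, 3, 1, 1], 1x1 shortcut projection
        self.conv4 = nn.Conv2d(64, 64, 3, padding=1)   # [64, 64, 3, 3]
        self.conv5 = nn.Conv2d(64, 64, 3, padding=1)   # [64, 64, 3, 3]
        self.relu = nn.ReLU()

    def forward(self, x):
        y = self.relu(self.conv2(self.relu(self.conv1(x))))  # Conv_0 .. Relu_3
        s = self.relu(self.conv3(x))                          # Conv_4, Relu_5
        t = y + s                                             # Add_6
        z = self.relu(self.conv5(self.relu(self.conv4(t))))  # Conv_7 .. Relu_10
        return z + t                                          # Add_11

# An export call along these lines would produce an equivalent resblock.onnx:
# torch.onnx.export(ResBlock(), torch.randn(1, 3, 480, 640), 'resblock.onnx')
```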
Other
- The repeated fields defined in ONNX's protobuf can be treated directly as Python lists, which makes them very convenient to parse (see the sketch below).
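A small sketch of this, assuming the same resblock.onnx as above:

```python
import onnx

model = onnx.load('./resblock.onnx')
graph = model.graph

# repeated protobuf fields support the usual Python sequence operations
first_node = graph.node[0]                                     # indexing
conv_nodes = [n for n in graph.node if n.op_type == 'Conv']    # iteration / filtering
node_names = list(n.name for n in graph.node)                  # explicit conversion to list
print(first_node.name, len(conv_nodes), node_names)
```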