System information
Describe the current behavior
When I try to deploy my trained model using this function:

def load_graph(frozen_graph_filename):
    with tf.gfile.GFile(frozen_graph_filename, "rb") as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
    with tf.Graph().as_default() as graph:
        tf.import_graph_def(graph_def, name='prefix')
    return graph
I get this error at the line graph_def.ParseFromString(f.read()):

google.protobuf.message.DecodeError: Error parsing message
Describe the expected behavior
For all my other trained models this error doesn't pop up and this method works. However, this model is one I trained myself; I got it from the export folder that the TF Object Detection API created.
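As a quick diagnostic (a sketch based on an assumption, not part of the original report): the Object Detection API export folder contains both frozen_inference_graph.pb, which is a GraphDef, and saved_model/saved_model.pb, which is a SavedModel proto, and only the former can be parsed with GraphDef.ParseFromString. Something like the following can tell which kind of file is actually being read:

import tensorflow as tf
from google.protobuf.message import DecodeError
from tensorflow.core.protobuf import saved_model_pb2

def inspect_pb(path):
    # Read the raw bytes once and try both proto formats.
    with tf.gfile.GFile(path, "rb") as f:
        data = f.read()
    try:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(data)
        print("Frozen GraphDef: tf.import_graph_def should work on this file.")
        return
    except DecodeError:
        pass
    try:
        saved_model = saved_model_pb2.SavedModel()
        saved_model.ParseFromString(data)
        print("SavedModel proto: load the surrounding directory with tf.saved_model.loader.load.")
    except DecodeError:
        print("Neither format parsed; the file may be incomplete or corrupted.")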
Code to reproduce the issue

def load_graph(frozen_graph_filename):
    with tf.gfile.GFile(frozen_graph_filename, "rb") as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
    with tf.Graph().as_default() as graph:
        tf.import_graph_def(graph_def, name='prefix')
    return graph
Other info / logs
Traceback (most recent call last):
  File "/Users/spencerkraisler/Desktop/raccoon_tutorial/model_deploy.py", line 27, in <module>
    detection_graph = load_graph(GRAPH_PATH)
  File "/Users/spencerkraisler/Desktop/raccoon_tutorial/model_deploy.py", line 10, in load_graph
    graph_def.ParseFromString(f.read())
google.protobuf.message.DecodeError: Error parsing message
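A common cause of this exact DecodeError (an assumption here, not confirmed by the logs above) is passing the export folder's saved_model/saved_model.pb to load_graph instead of frozen_inference_graph.pb; saved_model.pb is a SavedModel proto, not a GraphDef, so ParseFromString fails. If that is what happened, the SavedModel directory needs to be loaded with the SavedModel loader instead, roughly like this:

import tensorflow as tf

# Hypothetical path: the "saved_model" directory inside the export folder,
# i.e. the directory that contains saved_model.pb and variables/.
export_dir = "/path/to/export/saved_model"

graph = tf.Graph()
with tf.Session(graph=graph) as sess:
    # Load the MetaGraphDef tagged "serve" and restore its variables into the session.
    tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], export_dir)
    for op in graph.get_operations():
        print(op.name)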
Here is a solution to load the graph; it is working fine for me.
import tensorflow as tf

def load_model(path):
    # Read the frozen .pb file and parse it into a GraphDef.
    with tf.gfile.GFile(path, "rb") as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
    # Import the GraphDef into a fresh graph (no name prefix).
    with tf.Graph().as_default() as graph:
        tf.import_graph_def(graph_def, name="")
    return graph

if __name__ == '__main__':
    path = "D:/path/to/your/model.pb"  # path to your frozen .pb file
    graph = load_model(path)
    with tf.Session(graph=graph) as sess:
        for op in graph.get_operations():
            print(op.name)
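And in case a usage example helps once the graph is loaded: tensors can be fetched by name and evaluated in the session. The tensor names below are assumptions based on typical Object Detection API exports; the print(op.name) loop above shows the actual names in your graph.

import numpy as np

# "graph" comes from load_model(path) above; image is a dummy HxWx3 uint8 array.
image = np.zeros((300, 300, 3), dtype=np.uint8)

with tf.Session(graph=graph) as sess:
    # Assumed tensor names; adjust them to the names printed for your model.
    image_tensor = graph.get_tensor_by_name("image_tensor:0")
    boxes = graph.get_tensor_by_name("detection_boxes:0")
    scores = graph.get_tensor_by_name("detection_scores:0")
    out_boxes, out_scores = sess.run(
        [boxes, scores],
        feed_dict={image_tensor: np.expand_dims(image, axis=0)})
    print(out_boxes.shape, out_scores.shape)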
Hope it helps.