tensorflow Cannot deploy trained model: google.protobuf.message.DecodeError: Error parsing message

Please make sure that this is a bug. As per our GitHub Policy, we only address code/doc bugs, performance issues, feature requests and build/installation issues on GitHub. tag:bug_template

System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow):
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Mac OS High Sierra
  • Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device:
  • TensorFlow installed from (source or binary): Source
  • TensorFlow version (use command below): 1.18
  • Python version: 3.6.7
  • Bazel version (if compiling from source):
  • GCC/Compiler version (if compiling from source):
  • CUDA/cuDNN version:
  • GPU model and memory: No GPU

You can collect some of this information using our environment capture script. You can also obtain the TensorFlow version with python -c "import tensorflow as tf; print(tf.GIT_VERSION, tf.VERSION)"

Describe the current behavior
When I try to deploy my trained model using this function:

def load_graph(frozen_graph_filename):
    with tf.gfile.GFile(frozen_graph_filename, "rb") as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())

    with tf.Graph().as_default() as graph:
        tf.import_graph_def(graph_def, name='prefix')
    return graph

I get this error at the graph_def.ParseFromString(f.read()) line:

google.protobuf.message.DecodeError: Error parsing message
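
As a minimal check (the path below is a placeholder for the file that fails), the snippet tries to parse the file both as a binary GraphDef and as a text-format one, which should narrow down whether the file itself is the problem:

import tensorflow as tf
from google.protobuf import text_format

# Placeholder path -- substitute whatever .pb file is failing to parse.
GRAPH_PATH = "export/frozen_inference_graph.pb"

with tf.gfile.GFile(GRAPH_PATH, "rb") as f:
    data = f.read()

graph_def = tf.GraphDef()
try:
    graph_def.ParseFromString(data)  # binary GraphDef
    print("binary GraphDef with %d nodes" % len(graph_def.node))
except Exception:
    # Some export steps write a text-format graph (.pbtxt) instead of a binary one.
    text_format.Parse(data.decode("utf-8"), graph_def)
    print("text-format GraphDef with %d nodes" % len(graph_def.node))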

Describe the expected behavior
For all my other trained models this error doesn't pop up and this function works fine. This model, however, is one I trained myself; I took the frozen graph from the export folder that the TF Object Detection API created for me.
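
One possibility worth ruling out (an assumption on my part, since the exact file name isn't shown above): the Object Detection API export folder contains both frozen_inference_graph.pb and saved_model/saved_model.pb. The latter is a serialized SavedModel, not a GraphDef, and passing it to ParseFromString raises exactly this DecodeError. If that is the file being loaded, the SavedModel loader can be used instead, roughly like this:

import tensorflow as tf

# Placeholder path -- the saved_model/ subdirectory inside the export folder.
SAVED_MODEL_DIR = "export/saved_model"

with tf.Session(graph=tf.Graph()) as sess:
    # Loads the graph and variables tagged for serving into the session.
    tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], SAVED_MODEL_DIR)
    for op in sess.graph.get_operations()[:10]:
        print(op.name)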

Code to reproduce the issue

def load_graph(frozen_graph_filename):
    with tf.gfile.GFile(frozen_graph_filename, "rb") as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())

    with tf.Graph().as_default() as graph:
        tf.import_graph_def(graph_def, name='prefix')
    return graph

Provide a reproducible test case that is the bare minimum necessary to generate the problem.

Other info / logs
Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached.

Traceback (most recent call last):
  File "/Users/spencerkraisler/Desktop/raccoon_tutorial/model_deploy.py", line 27, in <module>
    detection_graph = load_graph(GRAPH_PATH)
  File "/Users/spencerkraisler/Desktop/raccoon_tutorial/model_deploy.py", line 10, in load_graph
    graph_def.ParseFromString(f.read())
google.protobuf.message.DecodeError: Error parsing message

Here is a solution to load the graph; it works fine for me.

import tensorflow as tf

def load_model(path):
    # Read the serialized GraphDef from the frozen .pb file.
    with tf.gfile.GFile(path, "rb") as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())

    # Import the GraphDef into a fresh graph.
    with tf.Graph().as_default() as graph:
        tf.import_graph_def(graph_def, name="")
    return graph

if __name__ == '__main__':
    path = "D:/path/to/your/model.pb"  # placeholder -- point this at your frozen .pb file
    graph = load_model(path)
    with tf.Session(graph=graph) as sess:
        for op in graph.get_operations():
            print(op.name)
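
Once load_model() returns, inference looks roughly like the sketch below (it continues from the graph returned above). The tensor names are the standard Object Detection API ones (image_tensor, detection_boxes, ...) and are an assumption here; check them against the op names printed above for your own graph.

import numpy as np
import tensorflow as tf

# Dummy input; load and resize a real image here. Shape: [batch, height, width, 3], dtype uint8.
image = np.zeros((1, 300, 300, 3), dtype=np.uint8)

with tf.Session(graph=graph) as sess:  # graph returned by load_model() above
    # Assumed Object Detection API tensor names -- verify against the printed op list.
    image_tensor = graph.get_tensor_by_name("image_tensor:0")
    boxes = graph.get_tensor_by_name("detection_boxes:0")
    scores = graph.get_tensor_by_name("detection_scores:0")
    classes = graph.get_tensor_by_name("detection_classes:0")

    out_boxes, out_scores, out_classes = sess.run(
        [boxes, scores, classes], feed_dict={image_tensor: image})
    print(out_scores[0][:5])  # top detection scores for the first (only) image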

Hope it helps.