
TensorFlow 1.14, Fix: "google.protobuf.message.DecodeError": Error parsing message

With protobuf v3.15 I get google.protobuf.message.DecodeError when using tf.Graph() to load a TensorFlow model into memory. After rewriting the tf.Graph() snippet below for TensorFlow v2, I got the same error.

I have also tried protobuf 3.12.4 (the same version as on Colab), and the same error appeared.

https://stackoverflow.com/questions/66842689/tensorflow-1-14-fix-google-protobuf-message-decodeerror-error-parsing-mess

Traceback (most recent call last):
  File "object_detection/webcam.py", line 25, in <module>
    od_graph_def.ParseFromString(serialized_graph)
google.protobuf.message.DecodeError: Error parsing message
[ WARN:0] global C:\projects\opencv-python\opencv\modules\videoio\src\cap_msmf.cpp (674) SourceReaderCB::~SourceReaderCB terminating async callback

I have reinstalled several different protobuf versions and still get the same error.
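For reference, a minimal check (a sketch; it assumes PATH_TO_FROZEN_GRAPH points at the same frozen_inference_graph.pb used in the snippet further down) that isolates the ParseFromString call and confirms the file on disk is a non-empty binary GraphDef, since DecodeError means the bytes being parsed are not a valid serialized GraphDef:

import os
import tensorflow as tf

# Assumed path; replace with the actual location of the exported frozen graph.
PATH_TO_FROZEN_GRAPH = 'frozen_inference_graph.pb'

# A truncated download or the wrong file (e.g. a checkpoint instead of a
# frozen graph) would show up here as a suspiciously small size.
print('size (bytes):', os.path.getsize(PATH_TO_FROZEN_GRAPH))

# Parse the file in isolation, outside the webcam/detection pipeline.
od_graph_def = tf.compat.v1.GraphDef()
with tf.io.gfile.GFile(PATH_TO_FROZEN_GRAPH, 'rb') as fid:
    od_graph_def.ParseFromString(fid.read())
print('nodes in graph:', len(od_graph_def.node))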

I have trained an "SSD MobileNet" model using TensorFlow 1.14 (CPU) for webcam object detection with OpenCV. After installing the required TensorFlow libraries, I ran model_builder_tf1.py and it successfully passed all 21 tests.

Snippet used to load the TensorFlow model into memory with tf.Graph():

detection_graph = tf.Graph()
with detection_graph.as_default():
    od_graph_def = tf.compat.v1.GraphDef()
    with tf.gfile.GFile(PATH_TO_FROZEN_GRAPH, 'rb') as fid:
        serialized_graph = fid.read()
        od_graph_def.ParseFromString(serialized_graph)
        tf.import_graph_def(od_graph_def, name='')
    sess = tf.compat.v1.Session(graph=detection_graph)

Note that TensorFlow 1.14 is installed in a conda environment.
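As a sanity check on the environment (a sketch, not specific to any particular setup), printing the versions and file locations of the packages that actually get imported can confirm that the conda environment's TensorFlow and protobuf are the ones in use, rather than copies from another Python installation:

import sys
import tensorflow as tf
import google.protobuf

print('python:', sys.executable)
print('tensorflow:', tf.__version__, tf.__file__)
print('protobuf:', google.protobuf.__version__, google.protobuf.__file__)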

Using protobuf==3.8, a different error appeared:

AttributeError: module 'google.protobuf.descriptor' has no attribute '_internal_create_key'
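That particular AttributeError is consistent with a runtime/generated-code mismatch: *_pb2.py files compiled by a recent protoc (such as those in the Object Detection API) reference descriptor._internal_create_key, which does not exist in a protobuf runtime as old as 3.8. A quick way to see whether the installed runtime has it (a sketch, not tied to any file in the project):

import google.protobuf
from google.protobuf import descriptor

print('protobuf runtime:', google.protobuf.__version__)
# Generated *_pb2.py modules built by a recent protoc reference this attribute;
# if it is missing, the runtime is older than the protoc used to compile the protos.
print('has _internal_create_key:', hasattr(descriptor, '_internal_create_key'))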

Can someone please suggest a solution to this problem?

submitted by /u/jhivesh
