
Export a model for inference.

Hi all,

I have written a script to export a pre-trained TensorFlow model
for inference. The model code is from this repository:
https://github.com/sabarim/itis.

I used DeepLab's export_model.py script as a reference to write a
similar one for this model.

Reference script link:
https://github.com/tensorflow/models/blob/master/research/deeplab/export_model.py

My script:

https://projectcode1.s3-us-west-1.amazonaws.com/export_model.py

I am getting an error when I try to run inference from the
saved model:

FailedPreconditionError: 2 root error(s) found.
  (0) Failed precondition: Attempting to use uninitialized value
      decoder/feature_projection0/BatchNorm/moving_variance
      [[{{node decoder/feature_projection0/BatchNorm/moving_variance/read}}]]
      [[SemanticPredictions/_13]]
  (1) Failed precondition: Attempting to use uninitialized value
      decoder/feature_projection0/BatchNorm/moving_variance
      [[{{node decoder/feature_projection0/BatchNorm/moving_variance/read}}]]
0 successful operations. 0 derived errors ignored.

Could anyone please take a look and help me understand the
problem?
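An error like this usually means the exported graph contains variables whose values were never restored into the session before saving. BatchNorm's moving statistics (moving_mean, moving_variance) are not in TRAINABLE_VARIABLES, so a Saver built from only the trainable variables will skip them. A minimal TF1-style sketch of the failure mode and the fix, assuming nothing about the linked script (the variable name below is illustrative, and tf.compat.v1 stands in for the TF1 API the DeepLab script uses): restore from the checkpoint with a Saver over all global variables before exporting.

```python
import os
import tempfile
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

tmp = tempfile.mkdtemp()
ckpt = os.path.join(tmp, "model.ckpt")

# "Training" graph: create and checkpoint a variable that stands in
# for BatchNorm's moving_variance (a non-trainable moving statistic).
g_train = tf.Graph()
with g_train.as_default():
    mv = tf.get_variable("decoder/BatchNorm/moving_variance",
                         initializer=tf.constant([1.5]),
                         trainable=False)
    saver = tf.train.Saver(tf.global_variables())
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        saver.save(sess, ckpt)

# "Export" graph: rebuild the variable and restore it from the
# checkpoint. Running the graph WITHOUT this restore (and without an
# initializer) is exactly what raises FailedPreconditionError:
# "Attempting to use uninitialized value ...".
g_export = tf.Graph()
with g_export.as_default():
    mv2 = tf.get_variable("decoder/BatchNorm/moving_variance",
                          shape=[1], trainable=False)
    # Saver over ALL global variables, so moving statistics are included.
    saver2 = tf.train.Saver(tf.global_variables())
    with tf.Session() as sess:
        saver2.restore(sess, ckpt)
        restored = sess.run(mv2)

print(restored)  # the checkpointed moving_variance value
```

In the real export script, the restore would happen in the same session that then writes the SavedModel (or freezes the graph), so every variable — including the decoder's BatchNorm statistics — carries a value at save time.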

submitted by /u/DamanpKaur

