Newb Question: How to host and load Tensorflow Models (as a directory) in the Cloud?

We have a Tensorflow workflow and model that works great when used in a local environment (Python) – however, we now need to push it to production (Heroku). So we’re thinking we need to move our model into some type of Cloud hosting.

If possible, I’d like to upload the model directory (not an H5 file) to a cloud service/storage provider and then load that model into Tensorflow.

Here is how we’re currently loading in a model, and what we’d like to be able to do:

# Current setup loads the model from a local directory
dnn_model = tf.keras.models.load_model('./neural_network/true_overall')

# We'd like to load the model from a cloud service/storage instead
dnn_model = tf.keras.models.load_model('
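For context, here's a minimal, self-contained sketch of the round-trip we have locally, plus (commented out) the cloud form we're hoping works. The `true_overall_demo` path and the `gs://your-bucket/...` URI are placeholders, not our real setup; as far as I understand, TensorFlow's filesystem layer can read `gs://` paths directly when GCS support and credentials are available, but I'd welcome corrections:

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in network; our real dnn_model replaces this.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])

path = "./true_overall_demo"  # placeholder path for this sketch
try:
    model.save(path)          # saves a SavedModel directory on TF <= 2.15
except ValueError:
    path += ".keras"          # Keras 3 requires an explicit file extension
    model.save(path)

# Same call we already use locally:
dnn_model = tf.keras.models.load_model(path)

# What we'd like: point the same call at cloud storage. With TensorFlow's
# GCS filesystem support, a gs:// URI may work directly (bucket/path here
# are hypothetical; credentials via GOOGLE_APPLICATION_CREDENTIALS):
# dnn_model = tf.keras.models.load_model("gs://your-bucket/true_overall")

# Sanity check: the reloaded model matches the original.
x = np.ones((1, 4), dtype="float32")
ok = bool(np.allclose(model(x), dnn_model(x)))
```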

Downloading the directory and running it from a temp directory isn’t an option with our setup – so we’ll need to be able to run the model from the cloud. We don’t necessarily need to “train” the model in the cloud, we just need to be able to load it.

I’ve looked into things like TensorFlow Serving and TensorFlow Cloud, but I’m not 100% sure if that’s what we need (we’re super new to Tensorflow and AI in general).

What’s the best way to get the models (as a directory) into the cloud so we can load them into our code?

submitted by /u/jengl
