Exporting a TensorFlow model

Learn how to export a TensorFlow model

Last updated 7th February, 2020.

Objective

TensorFlow is a popular machine learning library, and SavedModel is a serialization format supported by OVHcloud ML Serving. This tutorial covers how to export a trained TensorFlow model to a SavedModel file.

Requirements

  • A Python environment with TensorFlow 1.x (the code below uses the tf.estimator API) and scikit-learn installed

Convert a simple model to SavedModel

ML Serving supports TensorFlow models through TensorFlow's SavedModel serialization format.

Let's take a simple example of a TensorFlow model to illustrate:

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

import tensorflow as tf

# Loading data
iris = load_iris()
X, y = iris.data, iris.target
X_train, X_test, y_train, y_test = train_test_split(X, y)

# Creating classifier
classifier = tf.estimator.DNNClassifier(
    feature_columns=[tf.feature_column.numeric_column('x', shape=[4])],
    hidden_units=[10, 20, 10],
    n_classes=3
)
# Training classifier
classifier.train(
    input_fn=tf.estimator.inputs.numpy_input_fn(
        x={'x': X_train},
        y=y_train,
        num_epochs=1000,
        shuffle=True
    ),
    steps=100
)

Define the transformation function you need

If you want to export your data transformations along with your trained model, describe the mapping between the serving inputs and the model features using a ServingInputReceiver.

INPUT_FEATURES = ['sepal_length', 'sepal_width', 'petal_length', 'petal_width']

def serving_input_receiver_fn():
    """
    Define the inputs used to serve the model.
    :return: ServingInputReceiver
    """
    # One placeholder per raw input feature, each of shape [None, 1]
    input_features = {feature: tf.placeholder(tf.float32, [None, 1]) for feature in INPUT_FEATURES}
    # Concatenate the four [None, 1] tensors into the single [None, 4] tensor 'x' expected by the classifier
    model_features = {'x': tf.concat([input_features[feature] for feature in INPUT_FEATURES], axis=1)}

    return tf.estimator.export.ServingInputReceiver(
        receiver_tensors=input_features,
        features=model_features
    )

In the previous example, four tensors are transformed:

  • sepal_length of shape [None, 1]
  • sepal_width of shape [None, 1]
  • petal_length of shape [None, 1]
  • petal_width of shape [None, 1]

into a single tensor x of shape [None, 4] before feeding the classifier.

Note: the None value in a tensor shape represents a dimension of arbitrary length. For example, for a classification algorithm, None can simply be the number of records you wish to classify.
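The shape transformation above can be sketched with NumPy, used here as a stand-in for the TensorFlow graph operation (np.concatenate behaves like tf.concat for this purpose; the sample values are illustrative):

```python
import numpy as np

# Four feature columns, each holding 3 records with shape (3, 1) - the
# batch dimension (here 3) plays the role of None at serving time
sepal_length = np.array([[5.1], [6.2], [4.7]])
sepal_width = np.array([[3.5], [2.9], [3.2]])
petal_length = np.array([[1.4], [4.3], [1.3]])
petal_width = np.array([[0.2], [1.3], [0.2]])

# Concatenating along axis=1 yields a single (3, 4) array, just as
# tf.concat(..., axis=1) turns four [None, 1] tensors into one [None, 4] tensor
x = np.concatenate([sepal_length, sepal_width, petal_length, petal_width], axis=1)
print(x.shape)  # (3, 4)
```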

Launch the conversion and save it into a file

The trained model is converted by calling the export_saved_model method on your classifier. This method takes two parameters:

  • export_dir_base: The path where to save the serialized model
  • serving_input_receiver_fn: The transformation function used at serving time to feed your input to the trained classifier.

classifier.export_saved_model(
    export_dir_base='<path/where/to/saved/serialized/model>',
    serving_input_receiver_fn=serving_input_receiver_fn
)
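export_saved_model writes each export into a subdirectory of export_dir_base named with a Unix timestamp. A minimal sketch of locating the most recent export, assuming such a base directory (simulated below with a temporary directory):

```python
import tempfile
from pathlib import Path

# Simulate an export_dir_base containing two timestamped exports,
# as produced by successive export_saved_model calls
export_dir_base = Path(tempfile.mkdtemp())
for ts in ('1581084000', '1581170400'):
    (export_dir_base / ts).mkdir()

# Each export directory is named with a Unix timestamp, so the most
# recent export is the one with the largest numeric name
latest_export = max(
    (p for p in export_dir_base.iterdir() if p.name.isdigit()),
    key=lambda p: int(p.name),
)
print(latest_export.name)  # 1581170400
```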

Going further

