OVH Guides

Exporting a TensorFlow model

Learn how to export a TensorFlow model

Last updated 7th February, 2020.


TensorFlow is a popular machine learning library, and SavedModel is a serialization format supported by the OVHcloud Serving Engine. This tutorial covers how to export a trained TensorFlow model into a SavedModel file.


Convert a simple model to SavedModel

Serving Engine supports TensorFlow models thanks to the SavedModel serialization format of TensorFlow.

Let's take a simple example of a TensorFlow model to illustrate:

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

import tensorflow as tf

# Loading data
iris = load_iris()
X, y = iris.data, iris.target
X_train, X_test, y_train, y_test = train_test_split(X, y)

# Creating classifier
classifier = tf.estimator.DNNClassifier(
    feature_columns=[tf.feature_column.numeric_column('x', shape=[4])],
    hidden_units=[10, 20, 10],
    n_classes=3)

# Training classifier
train_input_fn = tf.estimator.inputs.numpy_input_fn(
    x={'x': X_train},
    y=y_train,
    batch_size=5,
    num_epochs=None,
    shuffle=True)
classifier.train(input_fn=train_input_fn, steps=1000)

Define the transformation function you need

If you want to export your data transformation along with your trained model, you should describe the mapping between the serving-time inputs and the model's features using a ServingInputReceiver.

INPUT_FEATURES = ['sepal_length', 'sepal_width', 'petal_length', 'petal_width']

def serving_input_receiver_fn():
    """
    This is used to define inputs to serve the model.
    :return: ServingInputReceiver
    """
    receiver_tensors = {feature: tf.placeholder(tf.float32, [None, 1]) for feature in INPUT_FEATURES}
    model_features = {'x': tf.concat([receiver_tensors[feature] for feature in INPUT_FEATURES], axis=1)}

    return tf.estimator.export.ServingInputReceiver(model_features, receiver_tensors)

In the previous example, four tensors are transformed:

  • sepal_length of shape [None, 1]
  • sepal_width of shape [None, 1]
  • petal_length of shape [None, 1]
  • petal_width of shape [None, 1]

Into a single tensor x of shape [None, 4] before feeding the classifier.

Note: the None value in a tensor shape represents a dimension of arbitrary length. For example, if you are using a classification algorithm, None can simply be the number of records you wish to classify.
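As a sanity check outside TensorFlow, the same reshaping can be sketched with NumPy (assumed installed): four (batch, 1) columns are concatenated along axis 1 into one (batch, 4) array, exactly like tf.concat in the receiver function above. The sample values here are illustrative.

```python
import numpy as np

# Four input columns, each of shape (batch, 1) -- here batch = 2
sepal_length = np.array([[5.1], [6.2]])
sepal_width = np.array([[3.5], [2.9]])
petal_length = np.array([[1.4], [4.3]])
petal_width = np.array([[0.2], [1.3]])

# Concatenate along axis=1, mirroring the tf.concat call in serving_input_receiver_fn
x = np.concatenate([sepal_length, sepal_width, petal_length, petal_width], axis=1)
print(x.shape)  # (2, 4)
```

Each row of x now holds the four features of one record, which is the layout the classifier's 'x' feature column expects.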

Launch the conversion and save it into a file

The trained model is converted by calling the export_saved_model function on your classifier. This function takes two parameters:

  • export_dir_base: The path where to save the serialized model
  • serving_input_receiver_fn: The transformation function used at serving time to feed your input to the trained classifier.
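Putting it together, a minimal sketch of the export call, assuming the classifier and serving_input_receiver_fn defined in the previous snippets ('saved_model' is an illustrative output directory, not a required name):

```python
# Assumes `classifier` and `serving_input_receiver_fn` from the snippets above.
# 'saved_model' is an illustrative export_dir_base; TensorFlow writes the model
# into a timestamped subdirectory of it and returns that path.
export_path = classifier.export_saved_model(
    export_dir_base='saved_model',
    serving_input_receiver_fn=serving_input_receiver_fn)
```

The resulting directory contains the saved_model.pb file and a variables/ folder, which together form the SavedModel you can upload to the Serving Engine.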

Going further

These guides might also interest you...