LostTech.TensorFlow : API Documentation

Type tf.saved_model

Namespace tensorflow

Public static methods

object build_signature_def(IDictionary<object, object> inputs, IDictionary<object, object> outputs, string method_name)

Utility function to build a SignatureDef protocol buffer.
Parameters
IDictionary<object, object> inputs
Inputs of the SignatureDef defined as a proto map of string to tensor info.
IDictionary<object, object> outputs
Outputs of the SignatureDef defined as a proto map of string to tensor info.
string method_name
Method name of the SignatureDef as a string.
Returns
object
A SignatureDef protocol buffer constructed based on the supplied arguments.
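For reference, the underlying Python API that this overload wraps builds the two maps from `TensorInfo` protos. A minimal sketch, assuming a trivial identity graph (the tensor names `x`/`y` are illustrative):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# Build a trivial graph whose tensors the signature will describe.
x = tf.compat.v1.placeholder(tf.float32, shape=[None, 3], name="x")
y = tf.identity(x, name="y")

# TensorInfo protos record the name, dtype and shape of each tensor.
inputs = {"x": tf.compat.v1.saved_model.build_tensor_info(x)}
outputs = {"y": tf.compat.v1.saved_model.build_tensor_info(y)}

signature = tf.compat.v1.saved_model.build_signature_def(
    inputs=inputs,
    outputs=outputs,
    method_name=tf.saved_model.PREDICT_METHOD_NAME)
```

The resulting `SignatureDef` can then be attached to a MetaGraphDef, e.g. via a SavedModelBuilder.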

object build_signature_def_dyn(object inputs, object outputs, object method_name)

Utility function to build a SignatureDef protocol buffer.
Parameters
object inputs
Inputs of the SignatureDef defined as a proto map of string to tensor info.
object outputs
Outputs of the SignatureDef defined as a proto map of string to tensor info.
object method_name
Method name of the SignatureDef as a string.
Returns
object
A SignatureDef protocol buffer constructed based on the supplied arguments.

object build_tensor_info(IGraphNodeBase tensor)

Utility function to build TensorInfo proto from a Tensor. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.utils.build_tensor_info or tf.compat.v1.saved_model.build_tensor_info.
Parameters
IGraphNodeBase tensor
Tensor or SparseTensor whose name, dtype and shape are used to build the TensorInfo. For SparseTensors, the names of the three constituent Tensors are used.
Returns
object
A TensorInfo protocol buffer constructed based on the supplied argument.
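In the equivalent Python API, the returned proto simply captures the tensor's name, dtype and shape; a minimal sketch (the placeholder name `input` is illustrative):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

t = tf.compat.v1.placeholder(tf.float32, shape=[None, 10], name="input")
info = tf.compat.v1.saved_model.build_tensor_info(t)
# info.name is "input:0"; the shape proto records the (None, 10)
# shape, with -1 standing in for the unknown batch dimension.
```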

object build_tensor_info(string tensor)

Utility function to build TensorInfo proto from a Tensor. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.utils.build_tensor_info or tf.compat.v1.saved_model.build_tensor_info.
Parameters
string tensor
Tensor or SparseTensor whose name, dtype and shape are used to build the TensorInfo. For SparseTensors, the names of the three constituent Tensors are used.
Returns
object
A TensorInfo protocol buffer constructed based on the supplied argument.

object build_tensor_info(IEnumerable<object> tensor)

Utility function to build TensorInfo proto from a Tensor. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.utils.build_tensor_info or tf.compat.v1.saved_model.build_tensor_info.
Parameters
IEnumerable<object> tensor
Tensor or SparseTensor whose name, dtype and shape are used to build the TensorInfo. For SparseTensors, the names of the three constituent Tensors are used.
Returns
object
A TensorInfo protocol buffer constructed based on the supplied argument.

object build_tensor_info(RaggedTensor tensor)

Utility function to build TensorInfo proto from a Tensor. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.utils.build_tensor_info or tf.compat.v1.saved_model.build_tensor_info.
Parameters
RaggedTensor tensor
Tensor or SparseTensor whose name, dtype and shape are used to build the TensorInfo. For SparseTensors, the names of the three constituent Tensors are used.
Returns
object
A TensorInfo protocol buffer constructed based on the supplied argument.

object build_tensor_info(int tensor)

Utility function to build TensorInfo proto from a Tensor. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.utils.build_tensor_info or tf.compat.v1.saved_model.build_tensor_info.
Parameters
int tensor
Tensor or SparseTensor whose name, dtype and shape are used to build the TensorInfo. For SparseTensors, the names of the three constituent Tensors are used.
Returns
object
A TensorInfo protocol buffer constructed based on the supplied argument.

object build_tensor_info_dyn(object tensor)

Utility function to build TensorInfo proto from a Tensor. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.utils.build_tensor_info or tf.compat.v1.saved_model.build_tensor_info.
Parameters
object tensor
Tensor or SparseTensor whose name, dtype and shape are used to build the TensorInfo. For SparseTensors, the names of the three constituent Tensors are used.
Returns
object
A TensorInfo protocol buffer constructed based on the supplied argument.

object classification_signature_def(IGraphNodeBase examples, IGraphNodeBase classes, IGraphNodeBase scores)

Creates classification signature from given examples and predictions.

This function produces signatures intended for use with the TensorFlow Serving Classify API (tensorflow_serving/apis/prediction_service.proto), and so constrains the input and output types to those allowed by TensorFlow Serving.
Parameters
IGraphNodeBase examples
A string `Tensor`, expected to accept serialized tf.Examples.
IGraphNodeBase classes
A string `Tensor`. Note that the ClassificationResponse message requires that class labels are strings, not integers or anything else.
IGraphNodeBase scores
A float `Tensor`.
Returns
object
A classification-flavored signature_def.
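A sketch of the same call in the underlying Python API, with illustrative placeholder and constant tensors (the dtypes matter: `examples` and `classes` must be string tensors, `scores` a float tensor):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

examples = tf.compat.v1.placeholder(tf.string, name="tf_example")
classes = tf.constant([["cat", "dog"]], name="classes")
scores = tf.constant([[0.9, 0.1]], name="scores")

signature = tf.compat.v1.saved_model.classification_signature_def(
    examples, classes, scores)
# The method name is the Serving Classify method, and the outputs map
# carries the "classes" and "scores" keys expected by that API.
```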

object classification_signature_def(IGraphNodeBase examples, IGraphNodeBase classes, string scores)

Creates classification signature from given examples and predictions.

This function produces signatures intended for use with the TensorFlow Serving Classify API (tensorflow_serving/apis/prediction_service.proto), and so constrains the input and output types to those allowed by TensorFlow Serving.
Parameters
IGraphNodeBase examples
A string `Tensor`, expected to accept serialized tf.Examples.
IGraphNodeBase classes
A string `Tensor`. Note that the ClassificationResponse message requires that class labels are strings, not integers or anything else.
string scores
A float `Tensor`.
Returns
object
A classification-flavored signature_def.

object classification_signature_def_dyn(object examples, object classes, object scores)

Creates classification signature from given examples and predictions.

This function produces signatures intended for use with the TensorFlow Serving Classify API (tensorflow_serving/apis/prediction_service.proto), and so constrains the input and output types to those allowed by TensorFlow Serving.
Parameters
object examples
A string `Tensor`, expected to accept serialized tf.Examples.
object classes
A string `Tensor`. Note that the ClassificationResponse message requires that class labels are strings, not integers or anything else.
object scores
A float `Tensor`.
Returns
object
A classification-flavored signature_def.

bool contains_saved_model(string export_dir)

Checks whether the provided export directory could contain a SavedModel.

Note that the method does not load any data by itself. If the method returns `false`, the export directory definitely does not contain a SavedModel. If the method returns `true`, the export directory may contain a SavedModel but provides no guarantee that it can be loaded.
Parameters
string export_dir
Absolute string path to possible export location. For example, '/my/foo/model'.
Returns
bool
True if the export directory contains SavedModel files, False otherwise.
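The "no guarantee" caveat is easy to demonstrate from the Python side: the check only looks for a `saved_model.pb` / `saved_model.pbtxt` file, so even an empty marker file flips the result:

```python
import os
import tempfile
import tensorflow as tf

export_dir = tempfile.mkdtemp()

# An empty directory is definitely not a SavedModel.
assert not tf.saved_model.contains_saved_model(export_dir)

# An empty saved_model.pb is enough to make the check pass, even
# though the directory cannot actually be loaded.
open(os.path.join(export_dir, "saved_model.pb"), "wb").close()
assert tf.saved_model.contains_saved_model(export_dir)
```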

SparseTensor get_tensor_from_tensor_info(object tensor_info, Graph graph, string import_scope)

Returns the Tensor or CompositeTensor described by a TensorInfo proto. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.utils.get_tensor_from_tensor_info or tf.compat.v1.saved_model.get_tensor_from_tensor_info.
Parameters
object tensor_info
A TensorInfo proto describing a Tensor or SparseTensor or CompositeTensor.
Graph graph
The tf.Graph in which tensors are looked up. If None, the current default graph is used.
string import_scope
If not None, names in `tensor_info` are prefixed with this string before lookup.
Returns
SparseTensor
The Tensor or SparseTensor or CompositeTensor in `graph` described by `tensor_info`.
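This is the inverse of `build_tensor_info`: the names stored in the proto are looked up in the given graph. A round-trip sketch in the underlying Python API:

```python
import tensorflow as tf

graph = tf.Graph()
with graph.as_default():
    x = tf.compat.v1.placeholder(tf.float32, shape=[None, 3], name="x")
    info = tf.compat.v1.saved_model.build_tensor_info(x)

# Lookup by name in `graph` returns the very same Tensor object.
recovered = tf.compat.v1.saved_model.get_tensor_from_tensor_info(
    info, graph=graph)
assert recovered is x
```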

object get_tensor_from_tensor_info_dyn(object tensor_info, object graph, object import_scope)

Returns the Tensor or CompositeTensor described by a TensorInfo proto. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.utils.get_tensor_from_tensor_info or tf.compat.v1.saved_model.get_tensor_from_tensor_info.
Parameters
object tensor_info
A TensorInfo proto describing a Tensor or SparseTensor or CompositeTensor.
object graph
The tf.Graph in which tensors are looked up. If None, the current default graph is used.
object import_scope
If not None, names in `tensor_info` are prefixed with this string before lookup.
Returns
object
The Tensor or SparseTensor or CompositeTensor in `graph` described by `tensor_info`.

bool is_valid_signature(object signature_def)

Determine whether a SignatureDef can be served by TensorFlow Serving.
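A sketch of the check from the Python side, assuming a trivial identity graph. A predict-flavored signature needs non-empty input and output maps to be servable:

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

x = tf.compat.v1.placeholder(tf.float32, shape=[None, 2], name="x")
y = tf.identity(x, name="y")

# A well-formed predict signature is servable.
good = tf.compat.v1.saved_model.predict_signature_def(
    inputs={"x": x}, outputs={"y": y})
assert tf.compat.v1.saved_model.is_valid_signature(good)

# A signature with no outputs cannot be served.
bad = tf.compat.v1.saved_model.build_signature_def(
    inputs={"x": tf.compat.v1.saved_model.build_tensor_info(x)},
    outputs={},
    method_name=tf.saved_model.PREDICT_METHOD_NAME)
assert not tf.compat.v1.saved_model.is_valid_signature(bad)
```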

object is_valid_signature_dyn(object signature_def)

Determine whether a SignatureDef can be served by TensorFlow Serving.

AutoTrackable load(Session sess, IEnumerable<string> tags, string export_dir, string import_scope, IDictionary<string, object> saver_kwargs)

Loads the model from a SavedModel as specified by tags. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.loader.load or tf.compat.v1.saved_model.load. There will be a new function for importing SavedModels in TensorFlow 2.0.
Parameters
Session sess
The TensorFlow session to restore the variables.
IEnumerable<string> tags
Set of string tags to identify the required MetaGraphDef. These should correspond to the tags used when saving the variables using the SavedModel `save()` API.
string export_dir
Directory in which the SavedModel protocol buffer and variables to be loaded are located.
string import_scope
Optional `string` -- if specified, prepend this string followed by '/' to all loaded tensor names. This scope is applied to tensor instances loaded into the passed session, but it is *not* written through to the static `MetaGraphDef` protocol buffer that is returned.
IDictionary<string, object> saver_kwargs
Optional keyword arguments passed through to Saver.
Returns
AutoTrackable
The `MetaGraphDef` protocol buffer loaded in the provided session. This can be used to further extract signature-defs, collection-defs, etc.
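A self-contained round trip in the underlying Python API, using `simple_save` (also a v1-era API) to produce a SavedModel and then restoring it with matching tags. The model and names here are illustrative:

```python
import tempfile
import tensorflow as tf

tf.compat.v1.disable_eager_execution()
export_dir = tempfile.mkdtemp() + "/model"

# Build and save a minimal model: y = x * w with w = 2.
with tf.compat.v1.Session(graph=tf.Graph()) as sess:
    x = tf.compat.v1.placeholder(tf.float32, shape=[None, 1], name="x")
    w = tf.compat.v1.get_variable("w", initializer=2.0)
    y = tf.identity(x * w, name="y")
    sess.run(tf.compat.v1.global_variables_initializer())
    tf.compat.v1.saved_model.simple_save(
        sess, export_dir, {"x": x}, {"y": y})

# Restore into a fresh session; the tags must match those used when
# saving (simple_save uses the "serve" tag, i.e. tf.saved_model.SERVING).
with tf.compat.v1.Session(graph=tf.Graph()) as sess:
    meta_graph_def = tf.compat.v1.saved_model.load(
        sess, [tf.saved_model.SERVING], export_dir)
    sig = meta_graph_def.signature_def["serving_default"]
    x_t = sess.graph.get_tensor_by_name(sig.inputs["x"].name)
    y_t = sess.graph.get_tensor_by_name(sig.outputs["y"].name)
    result = sess.run(y_t, {x_t: [[3.0]]})  # 3 * 2 = 6
```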

AutoTrackable load(Session sess, IEnumerable<string> tags, Byte[] export_dir, string import_scope, IDictionary<string, object> saver_kwargs)

Loads the model from a SavedModel as specified by tags. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.loader.load or tf.compat.v1.saved_model.load. There will be a new function for importing SavedModels in TensorFlow 2.0.
Parameters
Session sess
The TensorFlow session to restore the variables.
IEnumerable<string> tags
Set of string tags to identify the required MetaGraphDef. These should correspond to the tags used when saving the variables using the SavedModel `save()` API.
Byte[] export_dir
Directory in which the SavedModel protocol buffer and variables to be loaded are located.
string import_scope
Optional `string` -- if specified, prepend this string followed by '/' to all loaded tensor names. This scope is applied to tensor instances loaded into the passed session, but it is *not* written through to the static `MetaGraphDef` protocol buffer that is returned.
IDictionary<string, object> saver_kwargs
Optional keyword arguments passed through to Saver.
Returns
AutoTrackable
The `MetaGraphDef` protocol buffer loaded in the provided session. This can be used to further extract signature-defs, collection-defs, etc.

AutoTrackable load(_MonitoredSession sess, IEnumerable<string> tags, string export_dir, string import_scope, IDictionary<string, object> saver_kwargs)

Loads the model from a SavedModel as specified by tags. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.loader.load or tf.compat.v1.saved_model.load. There will be a new function for importing SavedModels in TensorFlow 2.0.
Parameters
_MonitoredSession sess
The TensorFlow session to restore the variables.
IEnumerable<string> tags
Set of string tags to identify the required MetaGraphDef. These should correspond to the tags used when saving the variables using the SavedModel `save()` API.
string export_dir
Directory in which the SavedModel protocol buffer and variables to be loaded are located.
string import_scope
Optional `string` -- if specified, prepend this string followed by '/' to all loaded tensor names. This scope is applied to tensor instances loaded into the passed session, but it is *not* written through to the static `MetaGraphDef` protocol buffer that is returned.
IDictionary<string, object> saver_kwargs
Optional keyword arguments passed through to Saver.
Returns
AutoTrackable
The `MetaGraphDef` protocol buffer loaded in the provided session. This can be used to further extract signature-defs, collection-defs, etc.

AutoTrackable load(_MonitoredSession sess, IEnumerable<string> tags, Byte[] export_dir, string import_scope, IDictionary<string, object> saver_kwargs)

Loads the model from a SavedModel as specified by tags. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.loader.load or tf.compat.v1.saved_model.load. There will be a new function for importing SavedModels in TensorFlow 2.0.
Parameters
_MonitoredSession sess
The TensorFlow session to restore the variables.
IEnumerable<string> tags
Set of string tags to identify the required MetaGraphDef. These should correspond to the tags used when saving the variables using the SavedModel `save()` API.
Byte[] export_dir
Directory in which the SavedModel protocol buffer and variables to be loaded are located.
string import_scope
Optional `string` -- if specified, prepend this string followed by '/' to all loaded tensor names. This scope is applied to tensor instances loaded into the passed session, but it is *not* written through to the static `MetaGraphDef` protocol buffer that is returned.
IDictionary<string, object> saver_kwargs
Optional keyword arguments passed through to Saver.
Returns
AutoTrackable
The `MetaGraphDef` protocol buffer loaded in the provided session. This can be used to further extract signature-defs, collection-defs, etc.

AutoTrackable load(LocalCLIDebugWrapperSession sess, IEnumerable<string> tags, string export_dir, string import_scope, IDictionary<string, object> saver_kwargs)

Loads the model from a SavedModel as specified by tags. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.loader.load or tf.compat.v1.saved_model.load. There will be a new function for importing SavedModels in TensorFlow 2.0.
Parameters
LocalCLIDebugWrapperSession sess
The TensorFlow session to restore the variables.
IEnumerable<string> tags
Set of string tags to identify the required MetaGraphDef. These should correspond to the tags used when saving the variables using the SavedModel `save()` API.
string export_dir
Directory in which the SavedModel protocol buffer and variables to be loaded are located.
string import_scope
Optional `string` -- if specified, prepend this string followed by '/' to all loaded tensor names. This scope is applied to tensor instances loaded into the passed session, but it is *not* written through to the static `MetaGraphDef` protocol buffer that is returned.
IDictionary<string, object> saver_kwargs
Optional keyword arguments passed through to Saver.
Returns
AutoTrackable
The `MetaGraphDef` protocol buffer loaded in the provided session. This can be used to further extract signature-defs, collection-defs, etc.

AutoTrackable load(LocalCLIDebugWrapperSession sess, IEnumerable<string> tags, Byte[] export_dir, string import_scope, IDictionary<string, object> saver_kwargs)

Loads the model from a SavedModel as specified by tags. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.loader.load or tf.compat.v1.saved_model.load. There will be a new function for importing SavedModels in TensorFlow 2.0.
Parameters
LocalCLIDebugWrapperSession sess
The TensorFlow session to restore the variables.
IEnumerable<string> tags
Set of string tags to identify the required MetaGraphDef. These should correspond to the tags used when saving the variables using the SavedModel `save()` API.
Byte[] export_dir
Directory in which the SavedModel protocol buffer and variables to be loaded are located.
string import_scope
Optional `string` -- if specified, prepend this string followed by '/' to all loaded tensor names. This scope is applied to tensor instances loaded into the passed session, but it is *not* written through to the static `MetaGraphDef` protocol buffer that is returned.
IDictionary<string, object> saver_kwargs
Optional keyword arguments passed through to Saver.
Returns
AutoTrackable
The `MetaGraphDef` protocol buffer loaded in the provided session. This can be used to further extract signature-defs, collection-defs, etc.

object load_dyn(object sess, object tags, object export_dir, object import_scope, IDictionary<string, object> saver_kwargs)

Loads the model from a SavedModel as specified by tags. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.loader.load or tf.compat.v1.saved_model.load. There will be a new function for importing SavedModels in TensorFlow 2.0.
Parameters
object sess
The TensorFlow session to restore the variables.
object tags
Set of string tags to identify the required MetaGraphDef. These should correspond to the tags used when saving the variables using the SavedModel `save()` API.
object export_dir
Directory in which the SavedModel protocol buffer and variables to be loaded are located.
object import_scope
Optional `string` -- if specified, prepend this string followed by '/' to all loaded tensor names. This scope is applied to tensor instances loaded into the passed session, but it is *not* written through to the static `MetaGraphDef` protocol buffer that is returned.
IDictionary<string, object> saver_kwargs
Optional keyword arguments passed through to Saver.
Returns
object
The `MetaGraphDef` protocol buffer loaded in the provided session. This can be used to further extract signature-defs, collection-defs, etc.

object load_v2(string export_dir, string tags)

Load a SavedModel from `export_dir`.

Signatures associated with the SavedModel are available as functions. Objects exported with tf.saved_model.save additionally have trackable objects and functions assigned to attributes.

_Loading Keras models_

Keras models are trackable, so they can be saved to SavedModel. The object returned by tf.saved_model.load is not a Keras object (i.e. doesn't have `.fit`, `.predict`, etc. methods). A few attributes and functions are still available: `.variables`, `.trainable_variables` and `.__call__`. Use tf.keras.models.load_model to restore the Keras model.

_Importing SavedModels from TensorFlow 1.x_

SavedModels from tf.estimator.Estimator or 1.x SavedModel APIs have a flat graph instead of tf.function objects. These SavedModels will have functions corresponding to their signatures in the `.signatures` attribute, but also have a `.prune` method which allows you to extract functions for new subgraphs. This is equivalent to importing the SavedModel and naming feeds and fetches in a Session from TensorFlow 1.x. See `tf.compat.v1.wrap_function` for details. These SavedModels also have a `.variables` attribute containing imported variables, and a `.graph` attribute representing the whole imported graph. For SavedModels exported from tf.saved_model.save, variables are instead assigned to whichever attributes they were assigned before export.
Parameters
string export_dir
The SavedModel directory to load from.
string tags
A tag or sequence of tags identifying the MetaGraph to load. Optional if the SavedModel contains a single MetaGraph, as for those exported from tf.saved_model.save.
Returns
object
A trackable object with a `signatures` attribute mapping from signature keys to functions. If the SavedModel was exported by tf.saved_model.save, it also points to trackable objects and functions which were attached to the exported object.
Show Example
imported = tf.saved_model.load(path)
f = imported.signatures["serving_default"]
print(f(x=tf.constant([[1.]])))
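The example above assumes a model already on disk at `path`. A self-contained round trip, pairing the load with tf.saved_model.save of a hypothetical `Doubler` module:

```python
import tempfile
import tensorflow as tf

# A hypothetical trackable module with a traced __call__.
class Doubler(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None, 1], tf.float32)])
    def __call__(self, x):
        return x * 2.0

path = tempfile.mkdtemp()
module = Doubler()
tf.saved_model.save(module, path,
                    signatures=module.__call__.get_concrete_function())

imported = tf.saved_model.load(path)
f = imported.signatures["serving_default"]
out = f(x=tf.constant([[3.0]]))        # dict of output tensors
result = list(out.values())[0]         # the doubled input
```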

object load_v2(string export_dir, IDictionary<string, IEnumerable<int>> tags)

Load a SavedModel from `export_dir`.

Signatures associated with the SavedModel are available as functions. Objects exported with tf.saved_model.save additionally have trackable objects and functions assigned to attributes.

_Loading Keras models_

Keras models are trackable, so they can be saved to SavedModel. The object returned by tf.saved_model.load is not a Keras object (i.e. doesn't have `.fit`, `.predict`, etc. methods). A few attributes and functions are still available: `.variables`, `.trainable_variables` and `.__call__`. Use tf.keras.models.load_model to restore the Keras model.

_Importing SavedModels from TensorFlow 1.x_

SavedModels from tf.estimator.Estimator or 1.x SavedModel APIs have a flat graph instead of tf.function objects. These SavedModels will have functions corresponding to their signatures in the `.signatures` attribute, but also have a `.prune` method which allows you to extract functions for new subgraphs. This is equivalent to importing the SavedModel and naming feeds and fetches in a Session from TensorFlow 1.x. See `tf.compat.v1.wrap_function` for details. These SavedModels also have a `.variables` attribute containing imported variables, and a `.graph` attribute representing the whole imported graph. For SavedModels exported from tf.saved_model.save, variables are instead assigned to whichever attributes they were assigned before export.
Parameters
string export_dir
The SavedModel directory to load from.
IDictionary<string, IEnumerable<int>> tags
A tag or sequence of tags identifying the MetaGraph to load. Optional if the SavedModel contains a single MetaGraph, as for those exported from tf.saved_model.save.
Returns
object
A trackable object with a `signatures` attribute mapping from signature keys to functions. If the SavedModel was exported by tf.saved_model.save, it also points to trackable objects and functions which were attached to the exported object.
Show Example
imported = tf.saved_model.load(path)
f = imported.signatures["serving_default"]
print(f(x=tf.constant([[1.]])))

object load_v2(Byte[] export_dir, IDictionary<string, IEnumerable<int>> tags)

Load a SavedModel from `export_dir`.

Signatures associated with the SavedModel are available as functions. Objects exported with tf.saved_model.save additionally have trackable objects and functions assigned to attributes.

_Loading Keras models_

Keras models are trackable, so they can be saved to SavedModel. The object returned by tf.saved_model.load is not a Keras object (i.e. doesn't have `.fit`, `.predict`, etc. methods). A few attributes and functions are still available: `.variables`, `.trainable_variables` and `.__call__`. Use tf.keras.models.load_model to restore the Keras model.

_Importing SavedModels from TensorFlow 1.x_

SavedModels from tf.estimator.Estimator or 1.x SavedModel APIs have a flat graph instead of tf.function objects. These SavedModels will have functions corresponding to their signatures in the `.signatures` attribute, but also have a `.prune` method which allows you to extract functions for new subgraphs. This is equivalent to importing the SavedModel and naming feeds and fetches in a Session from TensorFlow 1.x. See `tf.compat.v1.wrap_function` for details. These SavedModels also have a `.variables` attribute containing imported variables, and a `.graph` attribute representing the whole imported graph. For SavedModels exported from tf.saved_model.save, variables are instead assigned to whichever attributes they were assigned before export.
Parameters
Byte[] export_dir
The SavedModel directory to load from.
IDictionary<string, IEnumerable<int>> tags
A tag or sequence of tags identifying the MetaGraph to load. Optional if the SavedModel contains a single MetaGraph, as for those exported from tf.saved_model.save.
Returns
object
A trackable object with a `signatures` attribute mapping from signature keys to functions. If the SavedModel was exported by tf.saved_model.save, it also points to trackable objects and functions which were attached to the exported object.
Show Example
imported = tf.saved_model.load(path)
f = imported.signatures["serving_default"]
print(f(x=tf.constant([[1.]])))

object load_v2(Byte[] export_dir, IEnumerable<string> tags)

Load a SavedModel from `export_dir`.

Signatures associated with the SavedModel are available as functions. Objects exported with tf.saved_model.save additionally have trackable objects and functions assigned to attributes.

_Loading Keras models_

Keras models are trackable, so they can be saved to SavedModel. The object returned by tf.saved_model.load is not a Keras object (i.e. doesn't have `.fit`, `.predict`, etc. methods). A few attributes and functions are still available: `.variables`, `.trainable_variables` and `.__call__`. Use tf.keras.models.load_model to restore the Keras model.

_Importing SavedModels from TensorFlow 1.x_

SavedModels from tf.estimator.Estimator or 1.x SavedModel APIs have a flat graph instead of tf.function objects. These SavedModels will have functions corresponding to their signatures in the `.signatures` attribute, but also have a `.prune` method which allows you to extract functions for new subgraphs. This is equivalent to importing the SavedModel and naming feeds and fetches in a Session from TensorFlow 1.x. See `tf.compat.v1.wrap_function` for details. These SavedModels also have a `.variables` attribute containing imported variables, and a `.graph` attribute representing the whole imported graph. For SavedModels exported from tf.saved_model.save, variables are instead assigned to whichever attributes they were assigned before export.
Parameters
Byte[] export_dir
The SavedModel directory to load from.
IEnumerable<string> tags
A tag or sequence of tags identifying the MetaGraph to load. Optional if the SavedModel contains a single MetaGraph, as for those exported from tf.saved_model.save.
Returns
object
A trackable object with a `signatures` attribute mapping from signature keys to functions. If the SavedModel was exported by tf.saved_model.save, it also points to trackable objects and functions which were attached to the exported object.
Show Example
imported = tf.saved_model.load(path)
f = imported.signatures["serving_default"]
print(f(x=tf.constant([[1.]])))

object load_v2(Byte[] export_dir, string tags)

Load a SavedModel from `export_dir`.

Signatures associated with the SavedModel are available as functions. Objects exported with tf.saved_model.save additionally have trackable objects and functions assigned to attributes.

_Loading Keras models_

Keras models are trackable, so they can be saved to SavedModel. The object returned by tf.saved_model.load is not a Keras object (i.e. doesn't have `.fit`, `.predict`, etc. methods). A few attributes and functions are still available: `.variables`, `.trainable_variables` and `.__call__`. Use tf.keras.models.load_model to restore the Keras model.

_Importing SavedModels from TensorFlow 1.x_

SavedModels from tf.estimator.Estimator or 1.x SavedModel APIs have a flat graph instead of tf.function objects. These SavedModels will have functions corresponding to their signatures in the `.signatures` attribute, but also have a `.prune` method which allows you to extract functions for new subgraphs. This is equivalent to importing the SavedModel and naming feeds and fetches in a Session from TensorFlow 1.x. See `tf.compat.v1.wrap_function` for details. These SavedModels also have a `.variables` attribute containing imported variables, and a `.graph` attribute representing the whole imported graph. For SavedModels exported from tf.saved_model.save, variables are instead assigned to whichever attributes they were assigned before export.
Parameters
Byte[] export_dir
The SavedModel directory to load from.
string tags
A tag or sequence of tags identifying the MetaGraph to load. Optional if the SavedModel contains a single MetaGraph, as for those exported from tf.saved_model.save.
Returns
object
A trackable object with a `signatures` attribute mapping from signature keys to functions. If the SavedModel was exported by tf.saved_model.save, it also points to trackable objects and functions which were attached to the exported object.
Show Example
imported = tf.saved_model.load(path)
f = imported.signatures["serving_default"]
print(f(x=tf.constant([[1.]])))

object load_v2(string export_dir, IEnumerable<string> tags)

Load a SavedModel from `export_dir`.

Signatures associated with the SavedModel are available as functions. Objects exported with tf.saved_model.save additionally have trackable objects and functions assigned to attributes.

_Loading Keras models_

Keras models are trackable, so they can be saved to SavedModel. The object returned by tf.saved_model.load is not a Keras object (i.e. doesn't have `.fit`, `.predict`, etc. methods). A few attributes and functions are still available: `.variables`, `.trainable_variables` and `.__call__`. Use tf.keras.models.load_model to restore the Keras model.

_Importing SavedModels from TensorFlow 1.x_

SavedModels from tf.estimator.Estimator or 1.x SavedModel APIs have a flat graph instead of tf.function objects. These SavedModels will have functions corresponding to their signatures in the `.signatures` attribute, but also have a `.prune` method which allows you to extract functions for new subgraphs. This is equivalent to importing the SavedModel and naming feeds and fetches in a Session from TensorFlow 1.x. See `tf.compat.v1.wrap_function` for details. These SavedModels also have a `.variables` attribute containing imported variables, and a `.graph` attribute representing the whole imported graph. For SavedModels exported from tf.saved_model.save, variables are instead assigned to whichever attributes they were assigned before export.
Parameters
string export_dir
The SavedModel directory to load from.
IEnumerable<string> tags
A tag or sequence of tags identifying the MetaGraph to load. Optional if the SavedModel contains a single MetaGraph, as for those exported from tf.saved_model.save.
Returns
object
A trackable object with a `signatures` attribute mapping from signature keys to functions. If the SavedModel was exported by tf.saved_model.save, it also points to trackable objects and functions which were attached to the exported object.
Show Example
imported = tf.saved_model.load(path)
f = imported.signatures["serving_default"]
print(f(x=tf.constant([[1.]])))

object load_v2_dyn(object export_dir, object tags)

Load a SavedModel from `export_dir`.

Signatures associated with the SavedModel are available as functions. Objects exported with tf.saved_model.save additionally have trackable objects and functions assigned to attributes.

_Loading Keras models_

Keras models are trackable, so they can be saved to SavedModel. The object returned by tf.saved_model.load is not a Keras object (i.e. doesn't have `.fit`, `.predict`, etc. methods). A few attributes and functions are still available: `.variables`, `.trainable_variables` and `.__call__`. Use tf.keras.models.load_model to restore the Keras model.

_Importing SavedModels from TensorFlow 1.x_

SavedModels from tf.estimator.Estimator or 1.x SavedModel APIs have a flat graph instead of tf.function objects. These SavedModels will have functions corresponding to their signatures in the `.signatures` attribute, but also have a `.prune` method which allows you to extract functions for new subgraphs. This is equivalent to importing the SavedModel and naming feeds and fetches in a Session from TensorFlow 1.x. See `tf.compat.v1.wrap_function` for details. These SavedModels also have a `.variables` attribute containing imported variables, and a `.graph` attribute representing the whole imported graph. For SavedModels exported from tf.saved_model.save, variables are instead assigned to whichever attributes they were assigned before export.
Parameters
object export_dir
The SavedModel directory to load from.
object tags
A tag or sequence of tags identifying the MetaGraph to load. Optional if the SavedModel contains a single MetaGraph, as for those exported from tf.saved_model.save.
Returns
object
A trackable object with a `signatures` attribute mapping from signature keys to functions. If the SavedModel was exported by tf.saved_model.save, it also points to trackable objects and functions which were attached to the exported object.
Show Example
imported = tf.saved_model.load(path)
f = imported.signatures["serving_default"]
print(f(x=tf.constant([[1.]])))

object main_op_with_restore(object restore_op_name)

Returns a main op to init variables, tables and restore the graph. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.main_op_with_restore or tf.compat.v1.saved_model.main_op.main_op_with_restore.

Returns the main op, which groups the ops that initialize all variables, local variables, and tables, together with the restore op identified by `restore_op_name`.
Parameters
object restore_op_name
Name of the op to use to restore the graph.
Returns
object
The set of ops to be run as part of the main op upon the load operation.

object main_op_with_restore_dyn(object restore_op_name)

Returns a main op to init variables, tables and restore the graph. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.main_op_with_restore or tf.compat.v1.saved_model.main_op.main_op_with_restore.

Returns the main op, which groups the ops that initialize all variables, local variables, and tables, together with the restore op identified by `restore_op_name`.
Parameters
object restore_op_name
Name of the op to use to restore the graph.
Returns
object
The set of ops to be run as part of the main op upon the load operation.

object predict_signature_def(IDictionary<string, object> inputs, IDictionary<string, object> outputs)

Creates prediction signature from given inputs and outputs.

This function produces signatures intended for use with the TensorFlow Serving Predict API (tensorflow_serving/apis/prediction_service.proto). This API imposes no constraints on the input and output types.
Parameters
IDictionary<string, object> inputs
dict of string to `Tensor`.
IDictionary<string, object> outputs
dict of string to `Tensor`.
Returns
object
A prediction-flavored signature_def.
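To make the shape of the result concrete, here is a sketch that mimics `predict_signature_def` with plain dictionaries standing in for the SignatureDef and TensorInfo protos. The tensor names are invented; only the method-name string matches the real `signature_constants.PREDICT_METHOD_NAME` value.

```python
PREDICT_METHOD_NAME = "tensorflow/serving/predict"  # real signature_constants value

def predict_signature_def(inputs, outputs):
    """Build a dict shaped like a Predict-flavored SignatureDef proto."""
    if not inputs:
        raise ValueError("inputs cannot be empty")
    if not outputs:
        raise ValueError("outputs cannot be empty")
    # Stand-in for build_tensor_info: a real TensorInfo also carries
    # dtype and shape, omitted here for brevity.
    as_tensor_info = lambda name: {"name": name}
    return {
        "inputs": {k: as_tensor_info(v) for k, v in inputs.items()},
        "outputs": {k: as_tensor_info(v) for k, v in outputs.items()},
        "method_name": PREDICT_METHOD_NAME,
    }

# Hypothetical tensor names for illustration.
sig = predict_signature_def({"x": "x:0"}, {"y": "dense/BiasAdd:0"})
print(sig["method_name"])  # tensorflow/serving/predict
```

Since the Predict API imposes no constraints on input and output types, the sketch only validates non-emptiness, mirroring the real function's behavior.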

object predict_signature_def_dyn(object inputs, object outputs)

Creates prediction signature from given inputs and outputs.

This function produces signatures intended for use with the TensorFlow Serving Predict API (tensorflow_serving/apis/prediction_service.proto). This API imposes no constraints on the input and output types.
Parameters
object inputs
dict of string to `Tensor`.
object outputs
dict of string to `Tensor`.
Returns
object
A prediction-flavored signature_def.

object regression_signature_def(IGraphNodeBase examples, IGraphNodeBase predictions)

Creates regression signature from given examples and predictions.

This function produces signatures intended for use with the TensorFlow Serving Regress API (tensorflow_serving/apis/prediction_service.proto), and so constrains the input and output types to those allowed by TensorFlow Serving.
Parameters
IGraphNodeBase examples
A string `Tensor`, expected to accept serialized tf.Examples.
IGraphNodeBase predictions
A float `Tensor`.
Returns
object
A regression-flavored signature_def.

object regression_signature_def_dyn(object examples, object predictions)

Creates regression signature from given examples and predictions.

This function produces signatures intended for use with the TensorFlow Serving Regress API (tensorflow_serving/apis/prediction_service.proto), and so constrains the input and output types to those allowed by TensorFlow Serving.
Parameters
object examples
A string `Tensor`, expected to accept serialized tf.Examples.
object predictions
A float `Tensor`.
Returns
object
A regression-flavored signature_def.

void save(object obj, IGraphNodeBase export_dir, object signatures)

Exports the Trackable object `obj` to [SavedModel format](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md).

Example usage is shown below. The resulting SavedModel is servable with an input named "x", accepting values of any shape with dtype float32.

The optional `signatures` argument controls which methods in `obj` will be available to programs which consume `SavedModel`s, for example serving APIs. Python functions may be decorated with `@tf.function(input_signature=...)` and passed as signatures directly, or lazily with a call to `get_concrete_function` on the method decorated with `@tf.function`.

If the `signatures` argument is omitted, `obj` will be searched for `@tf.function`-decorated methods. If exactly one `@tf.function` is found, that method will be used as the default signature for the SavedModel. This behavior is expected to change in the future, when a corresponding tf.saved_model.load symbol is added. At that point signatures will be completely optional, and any `@tf.function` attached to `obj` or its dependencies will be exported for use with `load`.

When invoking a signature in an exported SavedModel, `Tensor` arguments are identified by name. These names will come from the Python function's argument names by default. They may be overridden by specifying a `name=...` argument in the corresponding tf.TensorSpec object. Explicit naming is required if multiple `Tensor`s are passed through a single argument to the Python function.

The outputs of functions used as `signatures` must either be flat lists, in which case outputs will be numbered, or a dictionary mapping string keys to `Tensor`, in which case the keys will be used to name outputs.
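The output-naming rule above can be sketched in plain Python. The `output_0` format is an assumption about the numbering scheme for illustration, not necessarily the literal names TensorFlow emits.

```python
def name_outputs(result):
    """Name signature outputs: dicts keep their string keys,
    flat lists get numbered names."""
    if isinstance(result, dict):
        return result
    return {f"output_{i}": v for i, v in enumerate(result)}

print(name_outputs([10, 20]))        # {'output_0': 10, 'output_1': 20}
print(name_outputs({"score": 0.9}))  # {'score': 0.9}
```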

Signatures are available in objects returned by tf.saved_model.load as a `.signatures` attribute. This is a reserved attribute: tf.saved_model.save on an object with a custom `.signatures` attribute will raise an exception.

Since tf.keras.Model objects are also Trackable, this function can be used to export Keras models, either with a signature specified or from a function without a fixed signature. tf.keras.Model instances constructed from inputs and outputs already have a signature and so do not require a `@tf.function` decorator or a `signatures` argument; if neither is specified, the model's forward pass is exported.

Variables must be tracked by assigning them to an attribute of a tracked object or to an attribute of `obj` directly. TensorFlow objects (e.g. layers from tf.keras.layers, optimizers from tf.train) track their variables automatically. This is the same tracking scheme that tf.train.Checkpoint uses, and an exported `Checkpoint` object may be restored as a training checkpoint by pointing tf.train.Checkpoint.restore to the SavedModel's "variables/" subdirectory. Currently variables are the only stateful objects supported by tf.saved_model.save, but others (e.g. tables) will be supported in the future.

tf.function does not hard-code device annotations from outside the function body, instead using the calling context's device. This means for example that exporting a model which runs on a GPU and serving it on a CPU will generally work, with some exceptions. tf.device annotations inside the body of the function will be hard-coded in the exported model; this type of annotation is discouraged. Device-specific operations, e.g. with "cuDNN" in the name or with device-specific layouts, may cause issues. Currently a `DistributionStrategy` is another exception: active distribution strategies will cause device placements to be hard-coded in a function. Exporting a single-device computation and importing under a `DistributionStrategy` is not currently supported, but may be in the future.

SavedModels exported with tf.saved_model.save [strip default-valued attributes](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md#stripping-default-valued-attributes) automatically, which removes one source of incompatibilities when the consumer of a SavedModel is running an older TensorFlow version than the producer. There are however other sources of incompatibilities which are not handled automatically, such as when the exported model contains operations which the consumer does not have definitions for.
Parameters
object obj
A trackable object to export.
IGraphNodeBase export_dir
A directory in which to write the SavedModel.
object signatures
Optional, either a tf.function with an input signature specified or the result of `f.get_concrete_function` on a `@tf.function`-decorated function `f`, in which case `f` will be used to generate a signature for the SavedModel under the default serving signature key. `signatures` may also be a dictionary, in which case it maps from signature keys to either tf.function instances with input signatures or concrete functions. The keys of such a dictionary may be arbitrary strings, but will typically be from the tf.saved_model.signature_constants module.
Show Example
class Adder(tf.Module):

  @tf.function(input_signature=[tf.TensorSpec(shape=None, dtype=tf.float32)])
  def add(self, x):
    return x + x + 1.

to_export = Adder()
tf.saved_model.save(to_export, '/tmp/adder')

void save(object obj, Byte[] export_dir, IDictionary<string, object> signatures)

Exports the Trackable object `obj` to [SavedModel format](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md).

Example usage is shown below. The resulting SavedModel is servable with an input named "x", accepting values of any shape with dtype float32.

The optional `signatures` argument controls which methods in `obj` will be available to programs which consume `SavedModel`s, for example serving APIs. Python functions may be decorated with `@tf.function(input_signature=...)` and passed as signatures directly, or lazily with a call to `get_concrete_function` on the method decorated with `@tf.function`.

If the `signatures` argument is omitted, `obj` will be searched for `@tf.function`-decorated methods. If exactly one `@tf.function` is found, that method will be used as the default signature for the SavedModel. This behavior is expected to change in the future, when a corresponding tf.saved_model.load symbol is added. At that point signatures will be completely optional, and any `@tf.function` attached to `obj` or its dependencies will be exported for use with `load`.

When invoking a signature in an exported SavedModel, `Tensor` arguments are identified by name. These names will come from the Python function's argument names by default. They may be overridden by specifying a `name=...` argument in the corresponding tf.TensorSpec object. Explicit naming is required if multiple `Tensor`s are passed through a single argument to the Python function.

The outputs of functions used as `signatures` must either be flat lists, in which case outputs will be numbered, or a dictionary mapping string keys to `Tensor`, in which case the keys will be used to name outputs.

Signatures are available in objects returned by tf.saved_model.load as a `.signatures` attribute. This is a reserved attribute: tf.saved_model.save on an object with a custom `.signatures` attribute will raise an exception.

Since tf.keras.Model objects are also Trackable, this function can be used to export Keras models, either with a signature specified or from a function without a fixed signature. tf.keras.Model instances constructed from inputs and outputs already have a signature and so do not require a `@tf.function` decorator or a `signatures` argument; if neither is specified, the model's forward pass is exported.

Variables must be tracked by assigning them to an attribute of a tracked object or to an attribute of `obj` directly. TensorFlow objects (e.g. layers from tf.keras.layers, optimizers from tf.train) track their variables automatically. This is the same tracking scheme that tf.train.Checkpoint uses, and an exported `Checkpoint` object may be restored as a training checkpoint by pointing tf.train.Checkpoint.restore to the SavedModel's "variables/" subdirectory. Currently variables are the only stateful objects supported by tf.saved_model.save, but others (e.g. tables) will be supported in the future.

tf.function does not hard-code device annotations from outside the function body, instead using the calling context's device. This means for example that exporting a model which runs on a GPU and serving it on a CPU will generally work, with some exceptions. tf.device annotations inside the body of the function will be hard-coded in the exported model; this type of annotation is discouraged. Device-specific operations, e.g. with "cuDNN" in the name or with device-specific layouts, may cause issues. Currently a `DistributionStrategy` is another exception: active distribution strategies will cause device placements to be hard-coded in a function. Exporting a single-device computation and importing under a `DistributionStrategy` is not currently supported, but may be in the future.

SavedModels exported with tf.saved_model.save [strip default-valued attributes](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md#stripping-default-valued-attributes) automatically, which removes one source of incompatibilities when the consumer of a SavedModel is running an older TensorFlow version than the producer. There are however other sources of incompatibilities which are not handled automatically, such as when the exported model contains operations which the consumer does not have definitions for.
Parameters
object obj
A trackable object to export.
Byte[] export_dir
A directory in which to write the SavedModel.
IDictionary<string, object> signatures
Optional, either a tf.function with an input signature specified or the result of `f.get_concrete_function` on a `@tf.function`-decorated function `f`, in which case `f` will be used to generate a signature for the SavedModel under the default serving signature key. `signatures` may also be a dictionary, in which case it maps from signature keys to either tf.function instances with input signatures or concrete functions. The keys of such a dictionary may be arbitrary strings, but will typically be from the tf.saved_model.signature_constants module.
Show Example
class Adder(tf.Module):

  @tf.function(input_signature=[tf.TensorSpec(shape=None, dtype=tf.float32)])
  def add(self, x):
    return x + x + 1.

to_export = Adder()
tf.saved_model.save(to_export, '/tmp/adder')

void save(object obj, Byte[] export_dir, PythonFunctionContainer signatures)

Exports the Trackable object `obj` to [SavedModel format](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md).

Example usage is shown below. The resulting SavedModel is servable with an input named "x", accepting values of any shape with dtype float32.

The optional `signatures` argument controls which methods in `obj` will be available to programs which consume `SavedModel`s, for example serving APIs. Python functions may be decorated with `@tf.function(input_signature=...)` and passed as signatures directly, or lazily with a call to `get_concrete_function` on the method decorated with `@tf.function`.

If the `signatures` argument is omitted, `obj` will be searched for `@tf.function`-decorated methods. If exactly one `@tf.function` is found, that method will be used as the default signature for the SavedModel. This behavior is expected to change in the future, when a corresponding tf.saved_model.load symbol is added. At that point signatures will be completely optional, and any `@tf.function` attached to `obj` or its dependencies will be exported for use with `load`.

When invoking a signature in an exported SavedModel, `Tensor` arguments are identified by name. These names will come from the Python function's argument names by default. They may be overridden by specifying a `name=...` argument in the corresponding tf.TensorSpec object. Explicit naming is required if multiple `Tensor`s are passed through a single argument to the Python function.

The outputs of functions used as `signatures` must either be flat lists, in which case outputs will be numbered, or a dictionary mapping string keys to `Tensor`, in which case the keys will be used to name outputs.

Signatures are available in objects returned by tf.saved_model.load as a `.signatures` attribute. This is a reserved attribute: tf.saved_model.save on an object with a custom `.signatures` attribute will raise an exception.

Since tf.keras.Model objects are also Trackable, this function can be used to export Keras models, either with a signature specified or from a function without a fixed signature. tf.keras.Model instances constructed from inputs and outputs already have a signature and so do not require a `@tf.function` decorator or a `signatures` argument; if neither is specified, the model's forward pass is exported.

Variables must be tracked by assigning them to an attribute of a tracked object or to an attribute of `obj` directly. TensorFlow objects (e.g. layers from tf.keras.layers, optimizers from tf.train) track their variables automatically. This is the same tracking scheme that tf.train.Checkpoint uses, and an exported `Checkpoint` object may be restored as a training checkpoint by pointing tf.train.Checkpoint.restore to the SavedModel's "variables/" subdirectory. Currently variables are the only stateful objects supported by tf.saved_model.save, but others (e.g. tables) will be supported in the future.

tf.function does not hard-code device annotations from outside the function body, instead using the calling context's device. This means for example that exporting a model which runs on a GPU and serving it on a CPU will generally work, with some exceptions. tf.device annotations inside the body of the function will be hard-coded in the exported model; this type of annotation is discouraged. Device-specific operations, e.g. with "cuDNN" in the name or with device-specific layouts, may cause issues. Currently a `DistributionStrategy` is another exception: active distribution strategies will cause device placements to be hard-coded in a function. Exporting a single-device computation and importing under a `DistributionStrategy` is not currently supported, but may be in the future.

SavedModels exported with tf.saved_model.save [strip default-valued attributes](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md#stripping-default-valued-attributes) automatically, which removes one source of incompatibilities when the consumer of a SavedModel is running an older TensorFlow version than the producer. There are however other sources of incompatibilities which are not handled automatically, such as when the exported model contains operations which the consumer does not have definitions for.
Parameters
object obj
A trackable object to export.
Byte[] export_dir
A directory in which to write the SavedModel.
PythonFunctionContainer signatures
Optional, either a tf.function with an input signature specified or the result of `f.get_concrete_function` on a `@tf.function`-decorated function `f`, in which case `f` will be used to generate a signature for the SavedModel under the default serving signature key. `signatures` may also be a dictionary, in which case it maps from signature keys to either tf.function instances with input signatures or concrete functions. The keys of such a dictionary may be arbitrary strings, but will typically be from the tf.saved_model.signature_constants module.
Show Example
class Adder(tf.Module):

  @tf.function(input_signature=[tf.TensorSpec(shape=None, dtype=tf.float32)])
  def add(self, x):
    return x + x + 1.

to_export = Adder()
tf.saved_model.save(to_export, '/tmp/adder')

void save(object obj, Byte[] export_dir, object signatures)

Exports the Trackable object `obj` to [SavedModel format](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md).

Example usage is shown below. The resulting SavedModel is servable with an input named "x", accepting values of any shape with dtype float32.

The optional `signatures` argument controls which methods in `obj` will be available to programs which consume `SavedModel`s, for example serving APIs. Python functions may be decorated with `@tf.function(input_signature=...)` and passed as signatures directly, or lazily with a call to `get_concrete_function` on the method decorated with `@tf.function`.

If the `signatures` argument is omitted, `obj` will be searched for `@tf.function`-decorated methods. If exactly one `@tf.function` is found, that method will be used as the default signature for the SavedModel. This behavior is expected to change in the future, when a corresponding tf.saved_model.load symbol is added. At that point signatures will be completely optional, and any `@tf.function` attached to `obj` or its dependencies will be exported for use with `load`.

When invoking a signature in an exported SavedModel, `Tensor` arguments are identified by name. These names will come from the Python function's argument names by default. They may be overridden by specifying a `name=...` argument in the corresponding tf.TensorSpec object. Explicit naming is required if multiple `Tensor`s are passed through a single argument to the Python function.

The outputs of functions used as `signatures` must either be flat lists, in which case outputs will be numbered, or a dictionary mapping string keys to `Tensor`, in which case the keys will be used to name outputs.

Signatures are available in objects returned by tf.saved_model.load as a `.signatures` attribute. This is a reserved attribute: tf.saved_model.save on an object with a custom `.signatures` attribute will raise an exception.

Since tf.keras.Model objects are also Trackable, this function can be used to export Keras models, either with a signature specified or from a function without a fixed signature. tf.keras.Model instances constructed from inputs and outputs already have a signature and so do not require a `@tf.function` decorator or a `signatures` argument; if neither is specified, the model's forward pass is exported.

Variables must be tracked by assigning them to an attribute of a tracked object or to an attribute of `obj` directly. TensorFlow objects (e.g. layers from tf.keras.layers, optimizers from tf.train) track their variables automatically. This is the same tracking scheme that tf.train.Checkpoint uses, and an exported `Checkpoint` object may be restored as a training checkpoint by pointing tf.train.Checkpoint.restore to the SavedModel's "variables/" subdirectory. Currently variables are the only stateful objects supported by tf.saved_model.save, but others (e.g. tables) will be supported in the future.

tf.function does not hard-code device annotations from outside the function body, instead using the calling context's device. This means for example that exporting a model which runs on a GPU and serving it on a CPU will generally work, with some exceptions. tf.device annotations inside the body of the function will be hard-coded in the exported model; this type of annotation is discouraged. Device-specific operations, e.g. with "cuDNN" in the name or with device-specific layouts, may cause issues. Currently a `DistributionStrategy` is another exception: active distribution strategies will cause device placements to be hard-coded in a function. Exporting a single-device computation and importing under a `DistributionStrategy` is not currently supported, but may be in the future.

SavedModels exported with tf.saved_model.save [strip default-valued attributes](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md#stripping-default-valued-attributes) automatically, which removes one source of incompatibilities when the consumer of a SavedModel is running an older TensorFlow version than the producer. There are however other sources of incompatibilities which are not handled automatically, such as when the exported model contains operations which the consumer does not have definitions for.
Parameters
object obj
A trackable object to export.
Byte[] export_dir
A directory in which to write the SavedModel.
object signatures
Optional, either a tf.function with an input signature specified or the result of `f.get_concrete_function` on a `@tf.function`-decorated function `f`, in which case `f` will be used to generate a signature for the SavedModel under the default serving signature key. `signatures` may also be a dictionary, in which case it maps from signature keys to either tf.function instances with input signatures or concrete functions. The keys of such a dictionary may be arbitrary strings, but will typically be from the tf.saved_model.signature_constants module.
Show Example
class Adder(tf.Module):

  @tf.function(input_signature=[tf.TensorSpec(shape=None, dtype=tf.float32)])
  def add(self, x):
    return x + x + 1.

to_export = Adder()
tf.saved_model.save(to_export, '/tmp/adder')

void save(object obj, IGraphNodeBase export_dir, IDictionary<string, object> signatures)

Exports the Trackable object `obj` to [SavedModel format](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md).

Example usage is shown below. The resulting SavedModel is servable with an input named "x", accepting values of any shape with dtype float32.

The optional `signatures` argument controls which methods in `obj` will be available to programs which consume `SavedModel`s, for example serving APIs. Python functions may be decorated with `@tf.function(input_signature=...)` and passed as signatures directly, or lazily with a call to `get_concrete_function` on the method decorated with `@tf.function`.

If the `signatures` argument is omitted, `obj` will be searched for `@tf.function`-decorated methods. If exactly one `@tf.function` is found, that method will be used as the default signature for the SavedModel. This behavior is expected to change in the future, when a corresponding tf.saved_model.load symbol is added. At that point signatures will be completely optional, and any `@tf.function` attached to `obj` or its dependencies will be exported for use with `load`.

When invoking a signature in an exported SavedModel, `Tensor` arguments are identified by name. These names will come from the Python function's argument names by default. They may be overridden by specifying a `name=...` argument in the corresponding tf.TensorSpec object. Explicit naming is required if multiple `Tensor`s are passed through a single argument to the Python function.

The outputs of functions used as `signatures` must either be flat lists, in which case outputs will be numbered, or a dictionary mapping string keys to `Tensor`, in which case the keys will be used to name outputs.

Signatures are available in objects returned by tf.saved_model.load as a `.signatures` attribute. This is a reserved attribute: tf.saved_model.save on an object with a custom `.signatures` attribute will raise an exception.

Since tf.keras.Model objects are also Trackable, this function can be used to export Keras models, either with a signature specified or from a function without a fixed signature. tf.keras.Model instances constructed from inputs and outputs already have a signature and so do not require a `@tf.function` decorator or a `signatures` argument; if neither is specified, the model's forward pass is exported.

Variables must be tracked by assigning them to an attribute of a tracked object or to an attribute of `obj` directly. TensorFlow objects (e.g. layers from tf.keras.layers, optimizers from tf.train) track their variables automatically. This is the same tracking scheme that tf.train.Checkpoint uses, and an exported `Checkpoint` object may be restored as a training checkpoint by pointing tf.train.Checkpoint.restore to the SavedModel's "variables/" subdirectory. Currently variables are the only stateful objects supported by tf.saved_model.save, but others (e.g. tables) will be supported in the future.

tf.function does not hard-code device annotations from outside the function body, instead using the calling context's device. This means for example that exporting a model which runs on a GPU and serving it on a CPU will generally work, with some exceptions. tf.device annotations inside the body of the function will be hard-coded in the exported model; this type of annotation is discouraged. Device-specific operations, e.g. with "cuDNN" in the name or with device-specific layouts, may cause issues. Currently a `DistributionStrategy` is another exception: active distribution strategies will cause device placements to be hard-coded in a function. Exporting a single-device computation and importing under a `DistributionStrategy` is not currently supported, but may be in the future.

SavedModels exported with tf.saved_model.save [strip default-valued attributes](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md#stripping-default-valued-attributes) automatically, which removes one source of incompatibilities when the consumer of a SavedModel is running an older TensorFlow version than the producer. There are however other sources of incompatibilities which are not handled automatically, such as when the exported model contains operations which the consumer does not have definitions for.
Parameters
object obj
A trackable object to export.
IGraphNodeBase export_dir
A directory in which to write the SavedModel.
IDictionary<string, object> signatures
Optional, either a tf.function with an input signature specified or the result of `f.get_concrete_function` on a `@tf.function`-decorated function `f`, in which case `f` will be used to generate a signature for the SavedModel under the default serving signature key. `signatures` may also be a dictionary, in which case it maps from signature keys to either tf.function instances with input signatures or concrete functions. The keys of such a dictionary may be arbitrary strings, but will typically be from the tf.saved_model.signature_constants module.
Show Example
class Adder(tf.Module):

  @tf.function(input_signature=[tf.TensorSpec(shape=None, dtype=tf.float32)])
  def add(self, x):
    return x + x + 1.

to_export = Adder()
tf.saved_model.save(to_export, '/tmp/adder')

void save(object obj, IGraphNodeBase export_dir, PythonFunctionContainer signatures)

Exports the Trackable object `obj` to [SavedModel format](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md).

The resulting SavedModel (see "Show Example" below) is then servable with an input named "x", its value having any shape and dtype float32.

The optional `signatures` argument controls which methods in `obj` will be available to programs which consume `SavedModel`s, for example serving APIs. Python functions may be decorated with `@tf.function(input_signature=...)` and passed as signatures directly, or lazily with a call to `get_concrete_function` on the method decorated with `@tf.function`.

If the `signatures` argument is omitted, `obj` will be searched for `@tf.function`-decorated methods. If exactly one `@tf.function` is found, that method will be used as the default signature for the SavedModel. This behavior is expected to change in the future, when a corresponding tf.saved_model.load symbol is added. At that point signatures will be completely optional, and any `@tf.function` attached to `obj` or its dependencies will be exported for use with `load`.

When invoking a signature in an exported SavedModel, `Tensor` arguments are identified by name. These names will come from the Python function's argument names by default. They may be overridden by specifying a `name=...` argument in the corresponding tf.TensorSpec object. Explicit naming is required if multiple `Tensor`s are passed through a single argument to the Python function.
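The name-resolution rule above can be sketched in plain Python: by default a signature input takes the function argument's name, and an explicit `name=...` on the corresponding tf.TensorSpec overrides it. The helper below is a hypothetical illustration of that rule, not the actual TensorFlow implementation:

```python
import inspect

def signature_input_names(fn, name_overrides=None):
    """Map each argument of fn to the name its input tensor will get.

    By default the Python argument name is used; an entry in
    name_overrides stands in for an explicit name=... on the
    corresponding tf.TensorSpec.
    """
    name_overrides = name_overrides or {}
    args = [p for p in inspect.signature(fn).parameters if p != "self"]
    return [name_overrides.get(a, a) for a in args]

def add(self, x):  # stand-in for a @tf.function-decorated method
    return x + x

default_names = signature_input_names(add)                       # ["x"]
renamed = signature_input_names(add, {"x": "features"})          # as if
# tf.TensorSpec(..., name="features") had been used for the argument.
```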

The outputs of functions used as `signatures` must either be flat lists, in which case outputs will be numbered, or a dictionary mapping string keys to `Tensor`, in which case the keys will be used to name outputs.
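The output-naming rule can be sketched the same way. The helper is hypothetical, and the exact `output_N` numbering pattern for list outputs is an assumption made for illustration:

```python
def name_signature_outputs(outputs):
    """Assign names to signature outputs per the rule above.

    A dict keeps its string keys; a flat list gets numbered names
    (the "output_N" pattern here is an illustrative assumption).
    """
    if isinstance(outputs, dict):
        return dict(outputs)
    return {f"output_{i}": t for i, t in enumerate(outputs)}

named = name_signature_outputs(["logits_tensor", "probs_tensor"])
# -> {"output_0": "logits_tensor", "output_1": "probs_tensor"}
```

Using a dict for outputs is usually preferable, since the keys become stable, self-documenting names for consumers of the SavedModel.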

Signatures are available in objects returned by tf.saved_model.load as a `.signatures` attribute. This is a reserved attribute: tf.saved_model.save on an object with a custom `.signatures` attribute will raise an exception.

Since tf.keras.Model objects are also Trackable, this function can be used to export Keras models. tf.keras.Model instances constructed from inputs and outputs already have a signature, and so do not require a `@tf.function` decorator or a `signatures` argument; if neither is specified, the model's forward pass is exported.

Variables must be tracked by assigning them to an attribute of a tracked object or to an attribute of `obj` directly. TensorFlow objects (e.g. layers from tf.keras.layers, optimizers from tf.train) track their variables automatically. This is the same tracking scheme that tf.train.Checkpoint uses, and an exported `Checkpoint` object may be restored as a training checkpoint by pointing tf.train.Checkpoint.restore to the SavedModel's "variables/" subdirectory. Currently variables are the only stateful objects supported by tf.saved_model.save, but others (e.g. tables) will be supported in the future.

tf.function does not hard-code device annotations from outside the function body, instead using the calling context's device. This means for example that exporting a model which runs on a GPU and serving it on a CPU will generally work, with some exceptions. tf.device annotations inside the body of the function will be hard-coded in the exported model; this type of annotation is discouraged. Device-specific operations, e.g. with "cuDNN" in the name or with device-specific layouts, may cause issues. Currently a `DistributionStrategy` is another exception: active distribution strategies will cause device placements to be hard-coded in a function. Exporting a single-device computation and importing under a `DistributionStrategy` is not currently supported, but may be in the future.

SavedModels exported with tf.saved_model.save [strip default-valued attributes](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md#stripping-default-valued-attributes) automatically, which removes one source of incompatibilities when the consumer of a SavedModel is running an older TensorFlow version than the producer. There are however other sources of incompatibilities which are not handled automatically, such as when the exported model contains operations which the consumer does not have definitions for.
Parameters
object obj
A trackable object to export.
IGraphNodeBase export_dir
A directory in which to write the SavedModel.
PythonFunctionContainer signatures
Optional, either a tf.function with an input signature specified or the result of `f.get_concrete_function` on a `@tf.function`-decorated function `f`, in which case `f` will be used to generate a signature for the SavedModel under the default serving signature key. `signatures` may also be a dictionary, in which case it maps from signature keys to either tf.function instances with input signatures or concrete functions. The keys of such a dictionary may be arbitrary strings, but will typically be from the tf.saved_model.signature_constants module.
Show Example
class Adder(tf.Module):

  @tf.function(input_signature=[tf.TensorSpec(shape=None, dtype=tf.float32)])
  def add(self, x):
    return x + x + 1.

to_export = Adder()
tf.saved_model.save(to_export, '/tmp/adder')

void save(object obj, string export_dir, object signatures)

Exports the Trackable object `obj` to [SavedModel format](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md).

The resulting SavedModel (see "Show Example" below) is then servable with an input named "x", its value having any shape and dtype float32.

The optional `signatures` argument controls which methods in `obj` will be available to programs which consume `SavedModel`s, for example serving APIs. Python functions may be decorated with `@tf.function(input_signature=...)` and passed as signatures directly, or lazily with a call to `get_concrete_function` on the method decorated with `@tf.function`.

If the `signatures` argument is omitted, `obj` will be searched for `@tf.function`-decorated methods. If exactly one `@tf.function` is found, that method will be used as the default signature for the SavedModel. This behavior is expected to change in the future, when a corresponding tf.saved_model.load symbol is added. At that point signatures will be completely optional, and any `@tf.function` attached to `obj` or its dependencies will be exported for use with `load`.

When invoking a signature in an exported SavedModel, `Tensor` arguments are identified by name. These names will come from the Python function's argument names by default. They may be overridden by specifying a `name=...` argument in the corresponding tf.TensorSpec object. Explicit naming is required if multiple `Tensor`s are passed through a single argument to the Python function.

The outputs of functions used as `signatures` must either be flat lists, in which case outputs will be numbered, or a dictionary mapping string keys to `Tensor`, in which case the keys will be used to name outputs.

Signatures are available in objects returned by tf.saved_model.load as a `.signatures` attribute. This is a reserved attribute: tf.saved_model.save on an object with a custom `.signatures` attribute will raise an exception.

Since tf.keras.Model objects are also Trackable, this function can be used to export Keras models. tf.keras.Model instances constructed from inputs and outputs already have a signature, and so do not require a `@tf.function` decorator or a `signatures` argument; if neither is specified, the model's forward pass is exported.

Variables must be tracked by assigning them to an attribute of a tracked object or to an attribute of `obj` directly. TensorFlow objects (e.g. layers from tf.keras.layers, optimizers from tf.train) track their variables automatically. This is the same tracking scheme that tf.train.Checkpoint uses, and an exported `Checkpoint` object may be restored as a training checkpoint by pointing tf.train.Checkpoint.restore to the SavedModel's "variables/" subdirectory. Currently variables are the only stateful objects supported by tf.saved_model.save, but others (e.g. tables) will be supported in the future.

tf.function does not hard-code device annotations from outside the function body, instead using the calling context's device. This means for example that exporting a model which runs on a GPU and serving it on a CPU will generally work, with some exceptions. tf.device annotations inside the body of the function will be hard-coded in the exported model; this type of annotation is discouraged. Device-specific operations, e.g. with "cuDNN" in the name or with device-specific layouts, may cause issues. Currently a `DistributionStrategy` is another exception: active distribution strategies will cause device placements to be hard-coded in a function. Exporting a single-device computation and importing under a `DistributionStrategy` is not currently supported, but may be in the future.

SavedModels exported with tf.saved_model.save [strip default-valued attributes](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md#stripping-default-valued-attributes) automatically, which removes one source of incompatibilities when the consumer of a SavedModel is running an older TensorFlow version than the producer. There are however other sources of incompatibilities which are not handled automatically, such as when the exported model contains operations which the consumer does not have definitions for.
Parameters
object obj
A trackable object to export.
string export_dir
A directory in which to write the SavedModel.
object signatures
Optional, either a tf.function with an input signature specified or the result of `f.get_concrete_function` on a `@tf.function`-decorated function `f`, in which case `f` will be used to generate a signature for the SavedModel under the default serving signature key. `signatures` may also be a dictionary, in which case it maps from signature keys to either tf.function instances with input signatures or concrete functions. The keys of such a dictionary may be arbitrary strings, but will typically be from the tf.saved_model.signature_constants module.
Show Example
class Adder(tf.Module):

  @tf.function(input_signature=[tf.TensorSpec(shape=None, dtype=tf.float32)])
  def add(self, x):
    return x + x + 1.

to_export = Adder()
tf.saved_model.save(to_export, '/tmp/adder')

void save(object obj, string export_dir, IDictionary<string, object> signatures)

Exports the Trackable object `obj` to [SavedModel format](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md).

The resulting SavedModel (see "Show Example" below) is then servable with an input named "x", its value having any shape and dtype float32.

The optional `signatures` argument controls which methods in `obj` will be available to programs which consume `SavedModel`s, for example serving APIs. Python functions may be decorated with `@tf.function(input_signature=...)` and passed as signatures directly, or lazily with a call to `get_concrete_function` on the method decorated with `@tf.function`.

If the `signatures` argument is omitted, `obj` will be searched for `@tf.function`-decorated methods. If exactly one `@tf.function` is found, that method will be used as the default signature for the SavedModel. This behavior is expected to change in the future, when a corresponding tf.saved_model.load symbol is added. At that point signatures will be completely optional, and any `@tf.function` attached to `obj` or its dependencies will be exported for use with `load`.

When invoking a signature in an exported SavedModel, `Tensor` arguments are identified by name. These names will come from the Python function's argument names by default. They may be overridden by specifying a `name=...` argument in the corresponding tf.TensorSpec object. Explicit naming is required if multiple `Tensor`s are passed through a single argument to the Python function.

The outputs of functions used as `signatures` must either be flat lists, in which case outputs will be numbered, or a dictionary mapping string keys to `Tensor`, in which case the keys will be used to name outputs.

Signatures are available in objects returned by tf.saved_model.load as a `.signatures` attribute. This is a reserved attribute: tf.saved_model.save on an object with a custom `.signatures` attribute will raise an exception.

Since tf.keras.Model objects are also Trackable, this function can be used to export Keras models. tf.keras.Model instances constructed from inputs and outputs already have a signature, and so do not require a `@tf.function` decorator or a `signatures` argument; if neither is specified, the model's forward pass is exported.

Variables must be tracked by assigning them to an attribute of a tracked object or to an attribute of `obj` directly. TensorFlow objects (e.g. layers from tf.keras.layers, optimizers from tf.train) track their variables automatically. This is the same tracking scheme that tf.train.Checkpoint uses, and an exported `Checkpoint` object may be restored as a training checkpoint by pointing tf.train.Checkpoint.restore to the SavedModel's "variables/" subdirectory. Currently variables are the only stateful objects supported by tf.saved_model.save, but others (e.g. tables) will be supported in the future.

tf.function does not hard-code device annotations from outside the function body, instead using the calling context's device. This means for example that exporting a model which runs on a GPU and serving it on a CPU will generally work, with some exceptions. tf.device annotations inside the body of the function will be hard-coded in the exported model; this type of annotation is discouraged. Device-specific operations, e.g. with "cuDNN" in the name or with device-specific layouts, may cause issues. Currently a `DistributionStrategy` is another exception: active distribution strategies will cause device placements to be hard-coded in a function. Exporting a single-device computation and importing under a `DistributionStrategy` is not currently supported, but may be in the future.

SavedModels exported with tf.saved_model.save [strip default-valued attributes](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md#stripping-default-valued-attributes) automatically, which removes one source of incompatibilities when the consumer of a SavedModel is running an older TensorFlow version than the producer. There are however other sources of incompatibilities which are not handled automatically, such as when the exported model contains operations which the consumer does not have definitions for.
Parameters
object obj
A trackable object to export.
string export_dir
A directory in which to write the SavedModel.
IDictionary<string, object> signatures
Optional, either a tf.function with an input signature specified or the result of `f.get_concrete_function` on a `@tf.function`-decorated function `f`, in which case `f` will be used to generate a signature for the SavedModel under the default serving signature key. `signatures` may also be a dictionary, in which case it maps from signature keys to either tf.function instances with input signatures or concrete functions. The keys of such a dictionary may be arbitrary strings, but will typically be from the tf.saved_model.signature_constants module.
Show Example
class Adder(tf.Module):

  @tf.function(input_signature=[tf.TensorSpec(shape=None, dtype=tf.float32)])
  def add(self, x):
    return x + x + 1.

to_export = Adder()
tf.saved_model.save(to_export, '/tmp/adder')

void save(object obj, string export_dir, PythonFunctionContainer signatures)

Exports the Trackable object `obj` to [SavedModel format](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md).

The resulting SavedModel (see "Show Example" below) is then servable with an input named "x", its value having any shape and dtype float32.

The optional `signatures` argument controls which methods in `obj` will be available to programs which consume `SavedModel`s, for example serving APIs. Python functions may be decorated with `@tf.function(input_signature=...)` and passed as signatures directly, or lazily with a call to `get_concrete_function` on the method decorated with `@tf.function`.

If the `signatures` argument is omitted, `obj` will be searched for `@tf.function`-decorated methods. If exactly one `@tf.function` is found, that method will be used as the default signature for the SavedModel. This behavior is expected to change in the future, when a corresponding tf.saved_model.load symbol is added. At that point signatures will be completely optional, and any `@tf.function` attached to `obj` or its dependencies will be exported for use with `load`.

When invoking a signature in an exported SavedModel, `Tensor` arguments are identified by name. These names will come from the Python function's argument names by default. They may be overridden by specifying a `name=...` argument in the corresponding tf.TensorSpec object. Explicit naming is required if multiple `Tensor`s are passed through a single argument to the Python function.

The outputs of functions used as `signatures` must either be flat lists, in which case outputs will be numbered, or a dictionary mapping string keys to `Tensor`, in which case the keys will be used to name outputs.

Signatures are available in objects returned by tf.saved_model.load as a `.signatures` attribute. This is a reserved attribute: tf.saved_model.save on an object with a custom `.signatures` attribute will raise an exception.

Since tf.keras.Model objects are also Trackable, this function can be used to export Keras models. tf.keras.Model instances constructed from inputs and outputs already have a signature, and so do not require a `@tf.function` decorator or a `signatures` argument; if neither is specified, the model's forward pass is exported.

Variables must be tracked by assigning them to an attribute of a tracked object or to an attribute of `obj` directly. TensorFlow objects (e.g. layers from tf.keras.layers, optimizers from tf.train) track their variables automatically. This is the same tracking scheme that tf.train.Checkpoint uses, and an exported `Checkpoint` object may be restored as a training checkpoint by pointing tf.train.Checkpoint.restore to the SavedModel's "variables/" subdirectory. Currently variables are the only stateful objects supported by tf.saved_model.save, but others (e.g. tables) will be supported in the future.

tf.function does not hard-code device annotations from outside the function body, instead using the calling context's device. This means for example that exporting a model which runs on a GPU and serving it on a CPU will generally work, with some exceptions. tf.device annotations inside the body of the function will be hard-coded in the exported model; this type of annotation is discouraged. Device-specific operations, e.g. with "cuDNN" in the name or with device-specific layouts, may cause issues. Currently a `DistributionStrategy` is another exception: active distribution strategies will cause device placements to be hard-coded in a function. Exporting a single-device computation and importing under a `DistributionStrategy` is not currently supported, but may be in the future.

SavedModels exported with tf.saved_model.save [strip default-valued attributes](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md#stripping-default-valued-attributes) automatically, which removes one source of incompatibilities when the consumer of a SavedModel is running an older TensorFlow version than the producer. There are however other sources of incompatibilities which are not handled automatically, such as when the exported model contains operations which the consumer does not have definitions for.
Parameters
object obj
A trackable object to export.
string export_dir
A directory in which to write the SavedModel.
PythonFunctionContainer signatures
Optional, either a tf.function with an input signature specified or the result of `f.get_concrete_function` on a `@tf.function`-decorated function `f`, in which case `f` will be used to generate a signature for the SavedModel under the default serving signature key. `signatures` may also be a dictionary, in which case it maps from signature keys to either tf.function instances with input signatures or concrete functions. The keys of such a dictionary may be arbitrary strings, but will typically be from the tf.saved_model.signature_constants module.
Show Example
class Adder(tf.Module):

  @tf.function(input_signature=[tf.TensorSpec(shape=None, dtype=tf.float32)])
  def add(self, x):
    return x + x + 1.

to_export = Adder()
tf.saved_model.save(to_export, '/tmp/adder')

void simple_save(object session, object export_dir, IDictionary<string, object> inputs, IDictionary<string, object> outputs, Operation legacy_init_op)

Convenience function to build a SavedModel suitable for serving. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.simple_save.

In many common cases, saving models for serving will be as simple as:

simple_save(session, export_dir, inputs={"x": x, "y": y}, outputs={"z": z})

Although in many cases it's not necessary to understand all of the many ways to configure a SavedModel, this method has a few practical implications:

- It will be treated as a graph for inference / serving (i.e. it uses the tag `saved_model.SERVING`).
- The SavedModel will load in TensorFlow Serving and supports the [Predict API](https://github.com/tensorflow/serving/blob/master/tensorflow_serving/apis/predict.proto). To use the Classify, Regress, or MultiInference APIs, please use either [tf.Estimator](https://www.tensorflow.org/api_docs/python/tf/estimator/Estimator) or the lower-level [SavedModel APIs](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md).
- Some TensorFlow ops depend on information on disk or other information called "assets". These are generally handled automatically by adding the assets to the `GraphKeys.ASSET_FILEPATHS` collection. Only assets in that collection are exported; if you need more custom behavior, you'll need to use the [SavedModelBuilder](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/builder.py).

More information about SavedModel and signatures can be found here: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md.
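The `inputs`/`outputs` maps that `simple_save` assembles internally have the same shape as those accepted by `build_signature_def` above. A minimal stdlib sketch of that structure, with plain dicts and strings standing in for the protocol buffers and tensors:

```python
def build_signature_def_sketch(inputs, outputs, method_name):
    """Plain-dict stand-in for a SignatureDef protocol buffer."""
    return {
        "inputs": dict(inputs),      # name -> tensor info
        "outputs": dict(outputs),    # name -> tensor info
        "method_name": method_name,
    }

# Mirrors the simple_save(..., inputs={"x": x, "y": y}, outputs={"z": z})
# call above; "x:0"-style strings stand in for TensorInfo protos.
sig = build_signature_def_sketch(
    inputs={"x": "x:0", "y": "y:0"},
    outputs={"z": "z:0"},
    method_name="tensorflow/serving/predict")
```

The real `simple_save` stores this SignatureDef under the default serving signature key, which is why the result is directly usable with TensorFlow Serving's Predict API.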
Parameters
object session
The TensorFlow session from which to save the meta graph and variables.
object export_dir
The path to which the SavedModel will be stored.
IDictionary<string, object> inputs
dict mapping string input names to tensors. These are added to the SignatureDef as the inputs.
IDictionary<string, object> outputs
dict mapping string output names to tensors. These are added to the SignatureDef as the outputs.
Operation legacy_init_op
Legacy support for op or group of ops to execute after the restore op upon a load.

void simple_save(object session, object export_dir, IDictionary<string, object> inputs, IDictionary<string, object> outputs, IGraphNodeBase legacy_init_op)

Convenience function to build a SavedModel suitable for serving. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.simple_save.

In many common cases, saving models for serving will be as simple as:

simple_save(session, export_dir, inputs={"x": x, "y": y}, outputs={"z": z})

Although in many cases it's not necessary to understand all of the many ways to configure a SavedModel, this method has a few practical implications:

- It will be treated as a graph for inference / serving (i.e. it uses the tag `saved_model.SERVING`).
- The SavedModel will load in TensorFlow Serving and supports the [Predict API](https://github.com/tensorflow/serving/blob/master/tensorflow_serving/apis/predict.proto). To use the Classify, Regress, or MultiInference APIs, please use either [tf.Estimator](https://www.tensorflow.org/api_docs/python/tf/estimator/Estimator) or the lower-level [SavedModel APIs](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md).
- Some TensorFlow ops depend on information on disk or other information called "assets". These are generally handled automatically by adding the assets to the `GraphKeys.ASSET_FILEPATHS` collection. Only assets in that collection are exported; if you need more custom behavior, you'll need to use the [SavedModelBuilder](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/builder.py).

More information about SavedModel and signatures can be found here: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md.
Parameters
object session
The TensorFlow session from which to save the meta graph and variables.
object export_dir
The path to which the SavedModel will be stored.
IDictionary<string, object> inputs
dict mapping string input names to tensors. These are added to the SignatureDef as the inputs.
IDictionary<string, object> outputs
dict mapping string output names to tensors. These are added to the SignatureDef as the outputs.
IGraphNodeBase legacy_init_op
Legacy support for op or group of ops to execute after the restore op upon a load.

void simple_save(object session, object export_dir, IDictionary<string, object> inputs, IDictionary<string, object> outputs, object legacy_init_op)

Convenience function to build a SavedModel suitable for serving. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.simple_save.

In many common cases, saving models for serving will be as simple as:

simple_save(session, export_dir, inputs={"x": x, "y": y}, outputs={"z": z})

Although in many cases it's not necessary to understand all of the many ways to configure a SavedModel, this method has a few practical implications:

- It will be treated as a graph for inference / serving (i.e. it uses the tag `saved_model.SERVING`).
- The SavedModel will load in TensorFlow Serving and supports the [Predict API](https://github.com/tensorflow/serving/blob/master/tensorflow_serving/apis/predict.proto). To use the Classify, Regress, or MultiInference APIs, please use either [tf.Estimator](https://www.tensorflow.org/api_docs/python/tf/estimator/Estimator) or the lower-level [SavedModel APIs](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md).
- Some TensorFlow ops depend on information on disk or other information called "assets". These are generally handled automatically by adding the assets to the `GraphKeys.ASSET_FILEPATHS` collection. Only assets in that collection are exported; if you need more custom behavior, you'll need to use the [SavedModelBuilder](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/builder.py).

More information about SavedModel and signatures can be found here: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md.
Parameters
object session
The TensorFlow session from which to save the meta graph and variables.
object export_dir
The path to which the SavedModel will be stored.
IDictionary<string, object> inputs
dict mapping string input names to tensors. These are added to the SignatureDef as the inputs.
IDictionary<string, object> outputs
dict mapping string output names to tensors. These are added to the SignatureDef as the outputs.
object legacy_init_op
Legacy support for op or group of ops to execute after the restore op upon a load.

object simple_save_dyn(object session, object export_dir, object inputs, object outputs, object legacy_init_op)

Convenience function to build a SavedModel suitable for serving. (deprecated)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.simple_save.

In many common cases, saving models for serving will be as simple as:

simple_save(session, export_dir, inputs={"x": x, "y": y}, outputs={"z": z})

Although in many cases it's not necessary to understand all of the many ways to configure a SavedModel, this method has a few practical implications:

- It will be treated as a graph for inference / serving (i.e. it uses the tag `saved_model.SERVING`).
- The SavedModel will load in TensorFlow Serving and supports the [Predict API](https://github.com/tensorflow/serving/blob/master/tensorflow_serving/apis/predict.proto). To use the Classify, Regress, or MultiInference APIs, please use either [tf.Estimator](https://www.tensorflow.org/api_docs/python/tf/estimator/Estimator) or the lower-level [SavedModel APIs](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md).
- Some TensorFlow ops depend on information on disk or other information called "assets". These are generally handled automatically by adding the assets to the `GraphKeys.ASSET_FILEPATHS` collection. Only assets in that collection are exported; if you need more custom behavior, you'll need to use the [SavedModelBuilder](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/builder.py).

More information about SavedModel and signatures can be found here: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md.
Parameters
object session
The TensorFlow session from which to save the meta graph and variables.
object export_dir
The path to which the SavedModel will be stored.
object inputs
dict mapping string input names to tensors. These are added to the SignatureDef as the inputs.
object outputs
dict mapping string output names to tensors. These are added to the SignatureDef as the outputs.
object legacy_init_op
Legacy support for op or group of ops to execute after the restore op upon a load.

Public properties

PythonFunctionContainer build_signature_def_fn get;

PythonFunctionContainer build_tensor_info_fn get;

PythonFunctionContainer classification_signature_def_fn get;

PythonFunctionContainer contains_saved_model_fn_ get;

PythonFunctionContainer get_tensor_from_tensor_info_fn get;

PythonFunctionContainer is_valid_signature_fn get;

PythonFunctionContainer load_v2_fn get;

PythonFunctionContainer main_op_with_restore_fn get;

PythonFunctionContainer predict_signature_def_fn get;

PythonFunctionContainer regression_signature_def_fn get;

PythonFunctionContainer simple_save_fn get;

Public fields

int SAVED_MODEL_SCHEMA_VERSION

return int

string VARIABLES_FILENAME

return string

string ASSETS_DIRECTORY

return string

string CLASSIFY_METHOD_NAME

return string

string CLASSIFY_OUTPUT_SCORES

return string

string CLASSIFY_INPUTS

return string

string REGRESS_INPUTS

return string

string ASSETS_KEY

return string

string CLASSIFY_OUTPUT_CLASSES

return string

string SAVED_MODEL_FILENAME_PB

return string

string DEFAULT_SERVING_SIGNATURE_DEF_KEY

return string

string PREDICT_INPUTS

return string

string REGRESS_METHOD_NAME

return string

string LEGACY_INIT_OP_KEY

return string

string SAVED_MODEL_FILENAME_PBTXT

return string

string PREDICT_METHOD_NAME

return string

string REGRESS_OUTPUTS

return string

string VARIABLES_DIRECTORY

return string

string MAIN_OP_KEY

return string

string PREDICT_OUTPUTS

return string

string SERVING

return string

string TRAINING

return string

string GPU

return string

string TPU

return string
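For orientation, the values most of these fields take in the upstream TensorFlow Python constants are well known. The listing below is informational only, an assumption based on upstream TensorFlow 1.x; consult the binding itself for the authoritative values in your version:

```python
# Well-known upstream values for the constants above (TensorFlow 1.x).
# Informational only -- not read from this binding.
SAVED_MODEL_CONSTANTS = {
    "SAVED_MODEL_SCHEMA_VERSION": 1,
    "SAVED_MODEL_FILENAME_PB": "saved_model.pb",
    "SAVED_MODEL_FILENAME_PBTXT": "saved_model.pbtxt",
    "VARIABLES_DIRECTORY": "variables",
    "VARIABLES_FILENAME": "variables",
    "ASSETS_DIRECTORY": "assets",
    "DEFAULT_SERVING_SIGNATURE_DEF_KEY": "serving_default",
    "CLASSIFY_METHOD_NAME": "tensorflow/serving/classify",
    "PREDICT_METHOD_NAME": "tensorflow/serving/predict",
    "REGRESS_METHOD_NAME": "tensorflow/serving/regress",
    # MetaGraph tags:
    "SERVING": "serve",
    "TRAINING": "train",
    "GPU": "gpu",
    "TPU": "tpu",
}
```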