Type estimator
Namespace tensorflow_estimator.contrib.estimator
Methods
- add_metrics
- add_metrics_dyn
- binary_classification_head
- binary_classification_head_dyn
- boosted_trees_classifier_train_in_memory
- boosted_trees_classifier_train_in_memory
- boosted_trees_classifier_train_in_memory_dyn
- boosted_trees_regressor_train_in_memory
- boosted_trees_regressor_train_in_memory
- boosted_trees_regressor_train_in_memory_dyn
- build_supervised_input_receiver_fn_from_input_fn
- build_supervised_input_receiver_fn_from_input_fn_dyn
- call_logit_fn
- call_logit_fn_dyn
- clip_gradients_by_norm
- clip_gradients_by_norm
- clip_gradients_by_norm
- clip_gradients_by_norm_dyn
- DNNClassifierWithLayerAnnotations
- DNNClassifierWithLayerAnnotations_dyn
- DNNRegressorWithLayerAnnotations
- DNNRegressorWithLayerAnnotations_dyn
- Equals
- export_all_saved_models
- export_all_saved_models_dyn
- export_saved_model_for_mode
- export_saved_model_for_mode_dyn
- forward_features
- forward_features_dyn
- logistic_regression_head
- logistic_regression_head_dyn
- make_input_layer_with_layer_annotations
- make_input_layer_with_layer_annotations_dyn
- multi_class_head
- multi_class_head_dyn
- multi_label_head
- multi_label_head_dyn
- poisson_regression_head
- poisson_regression_head_dyn
- regression_head
- regression_head_dyn
- replicate_model_fn_
- replicate_model_fn__dyn
- serialize_feature_column
- serialize_feature_column_dyn
- wrap_and_check_input_tensors
- wrap_and_check_input_tensors
- wrap_and_check_input_tensors
- wrap_and_check_input_tensors_dyn
Properties
- _MultiHead_fn
- _MultiLabelHead_fn
- _TransformGradients_fn
- _VariableDistributionMode_fn
- add_metrics_fn
- binary_classification_head_fn
- boosted_trees_classifier_train_in_memory_fn
- boosted_trees_regressor_train_in_memory_fn
- build_supervised_input_receiver_fn_from_input_fn_fn
- call_logit_fn_fn
- clip_gradients_by_norm_fn
- DNNClassifierWithLayerAnnotations_fn
- DNNRegressorWithLayerAnnotations_fn
- export_all_saved_models_fn
- export_saved_model_for_mode_fn
- forward_features_fn
- LayerAnnotationsCollectionNames_fn
- logistic_regression_head_fn
- make_input_layer_with_layer_annotations_fn
- multi_class_head_fn
- multi_label_head_fn
- poisson_regression_head_fn
- regression_head_fn
- replicate_model_fn__fn
- RNNClassifier_fn
- RNNEstimator_fn
- serialize_feature_column_fn
- ServingInputReceiver_fn
- SupervisedInputReceiver_fn
- TensorServingInputReceiver_fn
- TowerOptimizer_fn
- UnsupervisedInputReceiver_fn
- USE_DEFAULT
- USE_DEFAULT_dyn
- wrap_and_check_input_tensors_fn
Public instance methods
bool Equals(object obj)
Public static methods
Estimator add_metrics(TimeSeriesRegressor estimator, PythonFunctionContainer metric_fn)
object add_metrics_dyn(object estimator, object metric_fn)
Creates a new tf.estimator.Estimator which has the given metrics. See the example under Show Example below; a sketch of a custom metric that uses features follows it.
Parameters
- object estimator - A tf.estimator.Estimator object.
- object metric_fn - A function which should obey the following signature:
  - Args: can only have the following four arguments, in any order:
    * predictions: Predictions `Tensor` or dict of `Tensor` created by the given `estimator`.
    * features: Input `dict` of `Tensor` objects created by the `input_fn` which is given to `estimator.evaluate` as an argument.
    * labels: Labels `Tensor` or dict of `Tensor` created by the `input_fn` which is given to `estimator.evaluate` as an argument.
    * config: The config attribute of the `estimator`.
  - Returns: A dict of metric results keyed by name. The final metrics are a union of this and the `estimator`'s existing metrics; if there is a name conflict, this overrides the existing one. The values of the dict are the results of calling a metric function, namely a `(metric_tensor, update_op)` tuple.
Returns
- object - A new tf.estimator.Estimator which has a union of the original metrics with the given ones.
Show Example
def my_auc(labels, predictions):
    auc_metric = tf.keras.metrics.AUC(name="my_auc")
    auc_metric.update_state(y_true=labels, y_pred=predictions['logistic'])
    return {'auc': auc_metric}

estimator = tf.estimator.DNNClassifier(...)
estimator = tf.estimator.add_metrics(estimator, my_auc)
estimator.train(...)
estimator.evaluate(...)
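The example above reads from predictions; the custom metric that uses features, referenced earlier, is not reproduced on this page. A minimal sketch along the same lines, written against the Python API (the feature key 'feature' and the function name are hypothetical, chosen only for illustration), might look like:

def my_auc_from_features(labels, features):
    # The metric takes its score from the features dict built by input_fn;
    # 'feature' is an assumed key name for illustration only.
    auc_metric = tf.keras.metrics.AUC(name="my_auc")
    auc_metric.update_state(y_true=labels, y_pred=features['feature'])
    return {'auc': auc_metric}

estimator = tf.estimator.DNNClassifier(...)
estimator = tf.estimator.add_metrics(estimator, my_auc_from_features)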
_BinaryLogisticHeadWithSigmoidCrossEntropyLoss binary_classification_head(object weight_column, object thresholds, object label_vocabulary, ImplicitContainer<T> loss_reduction, PythonFunctionContainer loss_fn, Nullable<int> name)
object binary_classification_head_dyn(object weight_column, object thresholds, object label_vocabulary, ImplicitContainer<T> loss_reduction, object loss_fn, object name)
_BoostedTreesBase boosted_trees_classifier_train_in_memory(PythonFunctionContainer train_input_fn, object feature_columns, object model_dir, ImplicitContainer<T> n_classes, object weight_column, object label_vocabulary, int n_trees, int max_depth, double learning_rate, double l1_regularization, double l2_regularization, double tree_complexity, double min_node_weight, object config, object train_hooks, bool center_bias, string pruning_mode, double quantile_sketch_epsilon)
_BoostedTreesBase boosted_trees_classifier_train_in_memory(PythonFunctionContainer train_input_fn, object feature_columns, object model_dir, int n_classes, object weight_column, object label_vocabulary, int n_trees, int max_depth, double learning_rate, double l1_regularization, double l2_regularization, double tree_complexity, double min_node_weight, object config, object train_hooks, bool center_bias, string pruning_mode, double quantile_sketch_epsilon)
object boosted_trees_classifier_train_in_memory_dyn(object train_input_fn, object feature_columns, object model_dir, ImplicitContainer<T> n_classes, object weight_column, object label_vocabulary, ImplicitContainer<T> n_trees, ImplicitContainer<T> max_depth, ImplicitContainer<T> learning_rate, ImplicitContainer<T> l1_regularization, ImplicitContainer<T> l2_regularization, ImplicitContainer<T> tree_complexity, ImplicitContainer<T> min_node_weight, object config, object train_hooks, ImplicitContainer<T> center_bias, ImplicitContainer<T> pruning_mode, ImplicitContainer<T> quantile_sketch_epsilon)
_BoostedTreesBase boosted_trees_regressor_train_in_memory(PythonFunctionContainer train_input_fn, object feature_columns, object model_dir, int label_dimension, object weight_column, int n_trees, int max_depth, double learning_rate, double l1_regularization, double l2_regularization, double tree_complexity, double min_node_weight, object config, object train_hooks, bool center_bias, string pruning_mode, double quantile_sketch_epsilon)
_BoostedTreesBase boosted_trees_regressor_train_in_memory(PythonFunctionContainer train_input_fn, object feature_columns, object model_dir, ImplicitContainer<T> label_dimension, object weight_column, int n_trees, int max_depth, double learning_rate, double l1_regularization, double l2_regularization, double tree_complexity, double min_node_weight, object config, object train_hooks, bool center_bias, string pruning_mode, double quantile_sketch_epsilon)
object boosted_trees_regressor_train_in_memory_dyn(object train_input_fn, object feature_columns, object model_dir, ImplicitContainer<T> label_dimension, object weight_column, ImplicitContainer<T> n_trees, ImplicitContainer<T> max_depth, ImplicitContainer<T> learning_rate, ImplicitContainer<T> l1_regularization, ImplicitContainer<T> l2_regularization, ImplicitContainer<T> tree_complexity, ImplicitContainer<T> min_node_weight, object config, object train_hooks, ImplicitContainer<T> center_bias, ImplicitContainer<T> pruning_mode, ImplicitContainer<T> quantile_sketch_epsilon)
object build_supervised_input_receiver_fn_from_input_fn(PythonFunctionContainer input_fn, IDictionary<string, object> input_fn_args)
object build_supervised_input_receiver_fn_from_input_fn_dyn(object input_fn, IDictionary<string, object> input_fn_args)
PythonFunctionContainer call_logit_fn(PythonFunctionContainer logit_fn, object features, object mode, object params, object config)
Calls logit_fn (experimental). THIS FUNCTION IS EXPERIMENTAL; Keras layers/models are the recommended APIs for logit and model composition. This utility function calls the provided logit_fn with only the relevant subset of the provided arguments, similar to tf.estimator._call_model_fn().
Parameters
- PythonFunctionContainer logit_fn - A logit_fn as defined above.
- object features - The features dict.
- object mode - TRAIN / EVAL / PREDICT ModeKeys.
- object params - The hyperparameter dict.
- object config - The configuration object.
Returns
- PythonFunctionContainer - A logit Tensor, the output of logit_fn.
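As a rough illustration of how this utility dispatches arguments, here is a minimal sketch against the Python tf.contrib.estimator API (TF 1.x); the function name my_logit_fn and the feature key 'x' are hypothetical, chosen only for this example:

import tensorflow as tf

def my_logit_fn(features, params):
    # call_logit_fn inspects the signature and passes only the arguments
    # the function declares, so `mode` and `config` are simply omitted here.
    return tf.compat.v1.layers.dense(features['x'], units=params['n_classes'])

logits = tf.contrib.estimator.call_logit_fn(
    my_logit_fn,
    features={'x': tf.constant([[1.0, 2.0]])},
    mode=tf.estimator.ModeKeys.PREDICT,
    params={'n_classes': 3},
    config=None)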
object call_logit_fn_dyn(object logit_fn, object features, object mode, object params, object config)
Calls logit_fn (experimental). THIS FUNCTION IS EXPERIMENTAL; Keras layers/models are the recommended APIs for logit and model composition. This utility function calls the provided logit_fn with only the relevant subset of the provided arguments, similar to tf.estimator._call_model_fn().
Parameters
- object logit_fn - A logit_fn as defined above.
- object features - The features dict.
- object mode - TRAIN / EVAL / PREDICT ModeKeys.
- object params - The hyperparameter dict.
- object config - The configuration object.
Returns
- object - A logit Tensor, the output of logit_fn.