Type StatsAggregator
Namespace tensorflow.data.experimental
Parent PythonObjectContainer
Interfaces IStatsAggregator
A stateful resource that aggregates statistics from one or more iterators. To record statistics, use one of the custom transformation functions defined in this module when defining your `tf.data.Dataset`. All statistics will be aggregated by the `StatsAggregator` that is associated with a particular iterator (see below). For example, to record the latency of producing each element by iterating over a dataset, see the Show Example section below.
To associate a `StatsAggregator` with a `tf.data.Dataset` object, use the following pattern:
To get a protocol buffer summary of the currently aggregated statistics, use the `StatsAggregator.get_summary()` tensor. The easiest way to do this is to add the returned tensor to the `tf.GraphKeys.SUMMARIES` collection, so that the summaries will be included with any existing summaries.
Note: This interface is experimental and expected to change. In particular,
we expect to add other implementations of `StatsAggregator` that provide
different ways of exporting statistics, and add more types of statistics.
Show Example
dataset = ...
dataset = dataset.apply(tf.data.experimental.latency_stats("total_bytes"))
Methods
Properties
Public instance methods
Tensor get_summary()
Returns a string `tf.Tensor` that summarizes the aggregated statistics. The returned tensor will contain a serialized `tf.compat.v1.summary.Summary` protocol buffer, which can be used with the standard TensorBoard logging facilities.
object get_summary_dyn()
Returns a string `tf.Tensor` that summarizes the aggregated statistics. The returned tensor will contain a serialized `tf.compat.v1.summary.Summary` protocol buffer, which can be used with the standard TensorBoard logging facilities.
Returns
- object: A scalar string `tf.Tensor` that summarizes the aggregated statistics.