Type Iterator
Namespace tensorflow.data
Parent PythonObjectContainer
Represents the state of iterating through a `Dataset`.
Methods
- from_string_handle (4 overloads)
- from_string_handle_dyn
- from_structure (6 overloads)
- from_structure_dyn
- make_initializer (4 overloads)
- make_initializer_dyn
- string_handle
- string_handle_dyn
Properties
- element_spec
- element_spec_dyn
- initializer
- initializer_dyn
- output_classes
- output_classes_dyn
- output_shapes
- output_shapes_dyn
- output_types
- output_types_dyn
Public instance methods
object make_initializer(Dataset dataset, string name)
Returns a tf.Operation that initializes this iterator on `dataset`.
Parameters
- Dataset dataset - A `Dataset` with compatible structure to this iterator.
- string name - (Optional.) A name for the created operation.
Returns
- object - A tf.Operation that can be run to initialize this iterator on the given `dataset`.
object make_initializer(BatchDataset dataset, string name)
Returns a tf.Operation that initializes this iterator on `dataset`.
Parameters
- BatchDataset dataset - A `Dataset` with compatible structure to this iterator.
- string name - (Optional.) A name for the created operation.
Returns
- object - A tf.Operation that can be run to initialize this iterator on the given `dataset`.
object make_initializer(DatasetV1Adapter dataset, string name)
Returns a tf.Operation that initializes this iterator on `dataset`.
Parameters
- DatasetV1Adapter dataset - A `Dataset` with compatible structure to this iterator.
- string name - (Optional.) A name for the created operation.
Returns
- object - A tf.Operation that can be run to initialize this iterator on the given `dataset`.
object make_initializer(RepeatDataset dataset, string name)
Returns a tf.Operation that initializes this iterator on `dataset`.
Parameters
- RepeatDataset dataset - A `Dataset` with compatible structure to this iterator.
- string name - (Optional.) A name for the created operation.
Returns
- object - A tf.Operation that can be run to initialize this iterator on the given `dataset`.
object make_initializer_dyn(object dataset, object name)
Returns a tf.Operation that initializes this iterator on `dataset`.
Parameters
- object dataset - A `Dataset` with compatible structure to this iterator.
- object name - (Optional.) A name for the created operation.
Returns
- object - A tf.Operation that can be run to initialize this iterator on the given `dataset`.
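The following is a minimal sketch of how make_initializer is typically used, written in the TF 1.x Python API to match the other examples on this page; `sess` is assumed to be an open tf.compat.v1.Session.
# Sketch: create a structure-only iterator, then bind it to a dataset.
dataset = tf.data.Dataset.range(5)
iterator = tf.data.Iterator.from_structure(
    tf.compat.v1.data.get_output_types(dataset),
    tf.compat.v1.data.get_output_shapes(dataset))
init_op = iterator.make_initializer(dataset)  # the tf.Operation documented above
next_element = iterator.get_next()

sess.run(init_op)              # run the initializer before the first get_next()
print(sess.run(next_element))  # -> 0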
Tensor string_handle(string name)
Returns a string-valued tf.Tensor that represents this iterator.
object string_handle_dyn(object name)
Returns a string-valued tf.Tensor that represents this iterator.
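As an illustrative sketch in the TF 1.x Python API (assuming an open `sess`), the handle value is obtained like this so that it can later be fed to a feedable iterator built with from_string_handle:
# Sketch: evaluate the handle of a concrete iterator.
train_iterator = tf.data.Dataset.range(3).make_one_shot_iterator()
train_handle = sess.run(train_iterator.string_handle())  # a scalar tf.string value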
Public static methods
Iterator from_string_handle(IGraphNodeBase string_handle, DType output_types, IEnumerable<object> output_shapes, object output_classes)
Creates a new, uninitialized `Iterator` based on the given handle. This method allows you to define a "feedable" iterator where you can choose between concrete iterators by feeding a value in a tf.Session.run call. In that case, `string_handle` would be a `tf.compat.v1.placeholder`, and you would feed it with the value of tf.data.Iterator.string_handle in each step. For example, if you had two iterators that marked the current position in a training dataset and a test dataset, you could choose which to use in each step as follows:
Parameters
- IGraphNodeBase string_handle - A scalar tf.Tensor of type tf.string that evaluates to a handle produced by the `Iterator.string_handle()` method.
- DType output_types - A nested structure of tf.DType objects corresponding to each component of an element of this dataset.
- IEnumerable<object> output_shapes - (Optional.) A nested structure of tf.TensorShape objects corresponding to each component of an element of this dataset. If omitted, each component will have an unconstrained shape.
- object output_classes - (Optional.) A nested structure of Python `type` objects corresponding to each component of an element of this iterator. If omitted, each component is assumed to be of type tf.Tensor.
Returns
- Iterator - An `Iterator`.
Show Example
train_iterator = tf.data.Dataset(...).make_one_shot_iterator()
train_iterator_handle = sess.run(train_iterator.string_handle())

test_iterator = tf.data.Dataset(...).make_one_shot_iterator()
test_iterator_handle = sess.run(test_iterator.string_handle())

handle = tf.compat.v1.placeholder(tf.string, shape=[])
iterator = tf.data.Iterator.from_string_handle(
    handle, train_iterator.output_types)

next_element = iterator.get_next()
loss = f(next_element)

train_loss = sess.run(loss, feed_dict={handle: train_iterator_handle})
test_loss = sess.run(loss, feed_dict={handle: test_iterator_handle})
Iterator from_string_handle(IGraphNodeBase string_handle, DType output_types, TensorShape output_shapes, object output_classes)
Creates a new, uninitialized `Iterator` based on the given handle. This method allows you to define a "feedable" iterator where you can choose between concrete iterators by feeding a value in a tf.Session.run call. In that case, `string_handle` would be a `tf.compat.v1.placeholder`, and you would feed it with the value of tf.data.Iterator.string_handle in each step. For example, if you had two iterators that marked the current position in a training dataset and a test dataset, you could choose which to use in each step as follows:
Parameters
- IGraphNodeBase string_handle - A scalar tf.Tensor of type tf.string that evaluates to a handle produced by the `Iterator.string_handle()` method.
- DType output_types - A nested structure of tf.DType objects corresponding to each component of an element of this dataset.
- TensorShape output_shapes - (Optional.) A nested structure of tf.TensorShape objects corresponding to each component of an element of this dataset. If omitted, each component will have an unconstrained shape.
- object output_classes - (Optional.) A nested structure of Python `type` objects corresponding to each component of an element of this iterator. If omitted, each component is assumed to be of type tf.Tensor.
Returns
- Iterator - An `Iterator`.
Show Example
train_iterator = tf.data.Dataset(...).make_one_shot_iterator()
train_iterator_handle = sess.run(train_iterator.string_handle())

test_iterator = tf.data.Dataset(...).make_one_shot_iterator()
test_iterator_handle = sess.run(test_iterator.string_handle())

handle = tf.compat.v1.placeholder(tf.string, shape=[])
iterator = tf.data.Iterator.from_string_handle(
    handle, train_iterator.output_types)

next_element = iterator.get_next()
loss = f(next_element)

train_loss = sess.run(loss, feed_dict={handle: train_iterator_handle})
test_loss = sess.run(loss, feed_dict={handle: test_iterator_handle})
Iterator from_string_handle(IEnumerable<object> string_handle, DType output_types, TensorShape output_shapes, object output_classes)
Creates a new, uninitialized `Iterator` based on the given handle. This method allows you to define a "feedable" iterator where you can choose between concrete iterators by feeding a value in a tf.Session.run call. In that case, `string_handle` would be a `tf.compat.v1.placeholder`, and you would feed it with the value of tf.data.Iterator.string_handle in each step. For example, if you had two iterators that marked the current position in a training dataset and a test dataset, you could choose which to use in each step as follows:
Parameters
- IEnumerable<object> string_handle - A scalar tf.Tensor of type tf.string that evaluates to a handle produced by the `Iterator.string_handle()` method.
- DType output_types - A nested structure of tf.DType objects corresponding to each component of an element of this dataset.
- TensorShape output_shapes - (Optional.) A nested structure of tf.TensorShape objects corresponding to each component of an element of this dataset. If omitted, each component will have an unconstrained shape.
- object output_classes - (Optional.) A nested structure of Python `type` objects corresponding to each component of an element of this iterator. If omitted, each component is assumed to be of type tf.Tensor.
Returns
- Iterator - An `Iterator`.
Show Example
train_iterator = tf.data.Dataset(...).make_one_shot_iterator()
train_iterator_handle = sess.run(train_iterator.string_handle())

test_iterator = tf.data.Dataset(...).make_one_shot_iterator()
test_iterator_handle = sess.run(test_iterator.string_handle())

handle = tf.compat.v1.placeholder(tf.string, shape=[])
iterator = tf.data.Iterator.from_string_handle(
    handle, train_iterator.output_types)

next_element = iterator.get_next()
loss = f(next_element)

train_loss = sess.run(loss, feed_dict={handle: train_iterator_handle})
test_loss = sess.run(loss, feed_dict={handle: test_iterator_handle})
Iterator from_string_handle(IEnumerable<object> string_handle, DType output_types, IEnumerable<object> output_shapes, object output_classes)
Creates a new, uninitialized `Iterator` based on the given handle. This method allows you to define a "feedable" iterator where you can choose between concrete iterators by feeding a value in a tf.Session.run call. In that case, `string_handle` would be a `tf.compat.v1.placeholder`, and you would feed it with the value of tf.data.Iterator.string_handle in each step. For example, if you had two iterators that marked the current position in a training dataset and a test dataset, you could choose which to use in each step as follows:
Parameters
- IEnumerable<object> string_handle - A scalar tf.Tensor of type tf.string that evaluates to a handle produced by the `Iterator.string_handle()` method.
- DType output_types - A nested structure of tf.DType objects corresponding to each component of an element of this dataset.
- IEnumerable<object> output_shapes - (Optional.) A nested structure of tf.TensorShape objects corresponding to each component of an element of this dataset. If omitted, each component will have an unconstrained shape.
- object output_classes - (Optional.) A nested structure of Python `type` objects corresponding to each component of an element of this iterator. If omitted, each component is assumed to be of type tf.Tensor.
Returns
- Iterator - An `Iterator`.
Show Example
train_iterator = tf.data.Dataset(...).make_one_shot_iterator()
train_iterator_handle = sess.run(train_iterator.string_handle())

test_iterator = tf.data.Dataset(...).make_one_shot_iterator()
test_iterator_handle = sess.run(test_iterator.string_handle())

handle = tf.compat.v1.placeholder(tf.string, shape=[])
iterator = tf.data.Iterator.from_string_handle(
    handle, train_iterator.output_types)

next_element = iterator.get_next()
loss = f(next_element)

train_loss = sess.run(loss, feed_dict={handle: train_iterator_handle})
test_loss = sess.run(loss, feed_dict={handle: test_iterator_handle})
object from_string_handle_dyn(object string_handle, object output_types, object output_shapes, object output_classes)
Creates a new, uninitialized `Iterator` based on the given handle. This method allows you to define a "feedable" iterator where you can choose between concrete iterators by feeding a value in a tf.Session.run call. In that case, `string_handle` would be a `tf.compat.v1.placeholder`, and you would feed it with the value of tf.data.Iterator.string_handle in each step. For example, if you had two iterators that marked the current position in a training dataset and a test dataset, you could choose which to use in each step as follows:
Parameters
- object string_handle - A scalar tf.Tensor of type tf.string that evaluates to a handle produced by the `Iterator.string_handle()` method.
- object output_types - A nested structure of tf.DType objects corresponding to each component of an element of this dataset.
- object output_shapes - (Optional.) A nested structure of tf.TensorShape objects corresponding to each component of an element of this dataset. If omitted, each component will have an unconstrained shape.
- object output_classes - (Optional.) A nested structure of Python `type` objects corresponding to each component of an element of this iterator. If omitted, each component is assumed to be of type tf.Tensor.
Returns
- object - An `Iterator`.
Show Example
train_iterator = tf.data.Dataset(...).make_one_shot_iterator()
train_iterator_handle = sess.run(train_iterator.string_handle())

test_iterator = tf.data.Dataset(...).make_one_shot_iterator()
test_iterator_handle = sess.run(test_iterator.string_handle())

handle = tf.compat.v1.placeholder(tf.string, shape=[])
iterator = tf.data.Iterator.from_string_handle(
    handle, train_iterator.output_types)

next_element = iterator.get_next()
loss = f(next_element)

train_loss = sess.run(loss, feed_dict={handle: train_iterator_handle})
test_loss = sess.run(loss, feed_dict={handle: test_iterator_handle})
Iterator from_structure(ValueTuple<DType, object, object> output_types, IEnumerable<object> output_shapes, string shared_name, object output_classes)
Creates a new, uninitialized `Iterator` with the given structure. This iterator-constructing method can be used to create an iterator that is reusable with many different datasets. The returned iterator is not bound to a particular dataset, and it has no `initializer`. To initialize the iterator, run the operation returned by `Iterator.make_initializer(dataset)`. The following is an example:
Parameters
- ValueTuple<DType, object, object> output_types - A nested structure of tf.DType objects corresponding to each component of an element of this dataset.
- IEnumerable<object> output_shapes - (Optional.) A nested structure of tf.TensorShape objects corresponding to each component of an element of this dataset. If omitted, each component will have an unconstrained shape.
- string shared_name - (Optional.) If non-empty, this iterator will be shared under the given name across multiple sessions that share the same devices (e.g. when using a remote server).
- object output_classes - (Optional.) A nested structure of Python `type` objects corresponding to each component of an element of this iterator. If omitted, each component is assumed to be of type tf.Tensor.
Returns
- Iterator - An `Iterator`.
Show Example
iterator = Iterator.from_structure(tf.int64, tf.TensorShape([]))

dataset_range = Dataset.range(10)
range_initializer = iterator.make_initializer(dataset_range)

dataset_evens = dataset_range.filter(lambda x: x % 2 == 0)
evens_initializer = iterator.make_initializer(dataset_evens)

# Define a model based on the iterator; in this example, the model_fn
# is expected to take scalar tf.int64 Tensors as input (see
# the definition of 'iterator' above).
prediction, loss = model_fn(iterator.get_next())

# Train for `num_epochs`, where for each epoch, we first iterate over
# dataset_range, and then iterate over dataset_evens.
for _ in range(num_epochs):
  # Initialize the iterator to `dataset_range`
  sess.run(range_initializer)
  while True:
    try:
      pred, loss_val = sess.run([prediction, loss])
    except tf.errors.OutOfRangeError:
      break

  # Initialize the iterator to `dataset_evens`
  sess.run(evens_initializer)
  while True:
    try:
      pred, loss_val = sess.run([prediction, loss])
    except tf.errors.OutOfRangeError:
      break
Iterator from_structure(DType output_types, ValueTuple<IEnumerable<object>, object, object> output_shapes, string shared_name, object output_classes)
Creates a new, uninitialized `Iterator` with the given structure. This iterator-constructing method can be used to create an iterator that is reusable with many different datasets. The returned iterator is not bound to a particular dataset, and it has no `initializer`. To initialize the iterator, run the operation returned by `Iterator.make_initializer(dataset)`. The following is an example:
Parameters
- DType output_types - A nested structure of tf.DType objects corresponding to each component of an element of this dataset.
- ValueTuple<IEnumerable<object>, object, object> output_shapes - (Optional.) A nested structure of tf.TensorShape objects corresponding to each component of an element of this dataset. If omitted, each component will have an unconstrained shape.
- string shared_name - (Optional.) If non-empty, this iterator will be shared under the given name across multiple sessions that share the same devices (e.g. when using a remote server).
- object output_classes - (Optional.) A nested structure of Python `type` objects corresponding to each component of an element of this iterator. If omitted, each component is assumed to be of type tf.Tensor.
Returns
- Iterator - An `Iterator`.
Show Example
iterator = Iterator.from_structure(tf.int64, tf.TensorShape([]))

dataset_range = Dataset.range(10)
range_initializer = iterator.make_initializer(dataset_range)

dataset_evens = dataset_range.filter(lambda x: x % 2 == 0)
evens_initializer = iterator.make_initializer(dataset_evens)

# Define a model based on the iterator; in this example, the model_fn
# is expected to take scalar tf.int64 Tensors as input (see
# the definition of 'iterator' above).
prediction, loss = model_fn(iterator.get_next())

# Train for `num_epochs`, where for each epoch, we first iterate over
# dataset_range, and then iterate over dataset_evens.
for _ in range(num_epochs):
  # Initialize the iterator to `dataset_range`
  sess.run(range_initializer)
  while True:
    try:
      pred, loss_val = sess.run([prediction, loss])
    except tf.errors.OutOfRangeError:
      break

  # Initialize the iterator to `dataset_evens`
  sess.run(evens_initializer)
  while True:
    try:
      pred, loss_val = sess.run([prediction, loss])
    except tf.errors.OutOfRangeError:
      break
Iterator from_structure(DType output_types, IEnumerable<object> output_shapes, string shared_name, object output_classes)
Creates a new, uninitialized `Iterator` with the given structure. This iterator-constructing method can be used to create an iterator that is reusable with many different datasets. The returned iterator is not bound to a particular dataset, and it has no `initializer`. To initialize the iterator, run the operation returned by `Iterator.make_initializer(dataset)`. The following is an example:
Parameters
- DType output_types - A nested structure of tf.DType objects corresponding to each component of an element of this dataset.
- IEnumerable<object> output_shapes - (Optional.) A nested structure of tf.TensorShape objects corresponding to each component of an element of this dataset. If omitted, each component will have an unconstrained shape.
- string shared_name - (Optional.) If non-empty, this iterator will be shared under the given name across multiple sessions that share the same devices (e.g. when using a remote server).
- object output_classes - (Optional.) A nested structure of Python `type` objects corresponding to each component of an element of this iterator. If omitted, each component is assumed to be of type tf.Tensor.
Returns
- Iterator - An `Iterator`.
Show Example
iterator = Iterator.from_structure(tf.int64, tf.TensorShape([]))

dataset_range = Dataset.range(10)
range_initializer = iterator.make_initializer(dataset_range)

dataset_evens = dataset_range.filter(lambda x: x % 2 == 0)
evens_initializer = iterator.make_initializer(dataset_evens)

# Define a model based on the iterator; in this example, the model_fn
# is expected to take scalar tf.int64 Tensors as input (see
# the definition of 'iterator' above).
prediction, loss = model_fn(iterator.get_next())

# Train for `num_epochs`, where for each epoch, we first iterate over
# dataset_range, and then iterate over dataset_evens.
for _ in range(num_epochs):
  # Initialize the iterator to `dataset_range`
  sess.run(range_initializer)
  while True:
    try:
      pred, loss_val = sess.run([prediction, loss])
    except tf.errors.OutOfRangeError:
      break

  # Initialize the iterator to `dataset_evens`
  sess.run(evens_initializer)
  while True:
    try:
      pred, loss_val = sess.run([prediction, loss])
    except tf.errors.OutOfRangeError:
      break
Iterator from_structure(ValueTuple<DType, object, object> output_types, TensorShape output_shapes, string shared_name, object output_classes)
Creates a new, uninitialized `Iterator` with the given structure. This iterator-constructing method can be used to create an iterator that is reusable with many different datasets. The returned iterator is not bound to a particular dataset, and it has no `initializer`. To initialize the iterator, run the operation returned by `Iterator.make_initializer(dataset)`. The following is an example:
Parameters
- ValueTuple<DType, object, object> output_types - A nested structure of tf.DType objects corresponding to each component of an element of this dataset.
- TensorShape output_shapes - (Optional.) A nested structure of tf.TensorShape objects corresponding to each component of an element of this dataset. If omitted, each component will have an unconstrained shape.
- string shared_name - (Optional.) If non-empty, this iterator will be shared under the given name across multiple sessions that share the same devices (e.g. when using a remote server).
- object output_classes - (Optional.) A nested structure of Python `type` objects corresponding to each component of an element of this iterator. If omitted, each component is assumed to be of type tf.Tensor.
Returns
- Iterator - An `Iterator`.
Show Example
iterator = Iterator.from_structure(tf.int64, tf.TensorShape([]))

dataset_range = Dataset.range(10)
range_initializer = iterator.make_initializer(dataset_range)

dataset_evens = dataset_range.filter(lambda x: x % 2 == 0)
evens_initializer = iterator.make_initializer(dataset_evens)

# Define a model based on the iterator; in this example, the model_fn
# is expected to take scalar tf.int64 Tensors as input (see
# the definition of 'iterator' above).
prediction, loss = model_fn(iterator.get_next())

# Train for `num_epochs`, where for each epoch, we first iterate over
# dataset_range, and then iterate over dataset_evens.
for _ in range(num_epochs):
  # Initialize the iterator to `dataset_range`
  sess.run(range_initializer)
  while True:
    try:
      pred, loss_val = sess.run([prediction, loss])
    except tf.errors.OutOfRangeError:
      break

  # Initialize the iterator to `dataset_evens`
  sess.run(evens_initializer)
  while True:
    try:
      pred, loss_val = sess.run([prediction, loss])
    except tf.errors.OutOfRangeError:
      break
Iterator from_structure(ValueTuple<DType, object, object> output_types, ValueTuple<IEnumerable<object>, object, object> output_shapes, string shared_name, object output_classes)
Creates a new, uninitialized `Iterator` with the given structure. This iterator-constructing method can be used to create an iterator that is reusable with many different datasets. The returned iterator is not bound to a particular dataset, and it has no `initializer`. To initialize the iterator, run the operation returned by `Iterator.make_initializer(dataset)`. The following is an example:
Parameters
- ValueTuple<DType, object, object> output_types - A nested structure of tf.DType objects corresponding to each component of an element of this dataset.
- ValueTuple<IEnumerable<object>, object, object> output_shapes - (Optional.) A nested structure of tf.TensorShape objects corresponding to each component of an element of this dataset. If omitted, each component will have an unconstrained shape.
- string shared_name - (Optional.) If non-empty, this iterator will be shared under the given name across multiple sessions that share the same devices (e.g. when using a remote server).
- object output_classes - (Optional.) A nested structure of Python `type` objects corresponding to each component of an element of this iterator. If omitted, each component is assumed to be of type tf.Tensor.
Returns
- Iterator - An `Iterator`.
Show Example
iterator = Iterator.from_structure(tf.int64, tf.TensorShape([]))

dataset_range = Dataset.range(10)
range_initializer = iterator.make_initializer(dataset_range)

dataset_evens = dataset_range.filter(lambda x: x % 2 == 0)
evens_initializer = iterator.make_initializer(dataset_evens)

# Define a model based on the iterator; in this example, the model_fn
# is expected to take scalar tf.int64 Tensors as input (see
# the definition of 'iterator' above).
prediction, loss = model_fn(iterator.get_next())

# Train for `num_epochs`, where for each epoch, we first iterate over
# dataset_range, and then iterate over dataset_evens.
for _ in range(num_epochs):
  # Initialize the iterator to `dataset_range`
  sess.run(range_initializer)
  while True:
    try:
      pred, loss_val = sess.run([prediction, loss])
    except tf.errors.OutOfRangeError:
      break

  # Initialize the iterator to `dataset_evens`
  sess.run(evens_initializer)
  while True:
    try:
      pred, loss_val = sess.run([prediction, loss])
    except tf.errors.OutOfRangeError:
      break
Iterator from_structure(DType output_types, TensorShape output_shapes, string shared_name, object output_classes)
Creates a new, uninitialized `Iterator` with the given structure. This iterator-constructing method can be used to create an iterator that is reusable with many different datasets. The returned iterator is not bound to a particular dataset, and it has no `initializer`. To initialize the iterator, run the operation returned by `Iterator.make_initializer(dataset)`. The following is an example:
Parameters
- DType output_types - A nested structure of tf.DType objects corresponding to each component of an element of this dataset.
- TensorShape output_shapes - (Optional.) A nested structure of tf.TensorShape objects corresponding to each component of an element of this dataset. If omitted, each component will have an unconstrained shape.
- string shared_name - (Optional.) If non-empty, this iterator will be shared under the given name across multiple sessions that share the same devices (e.g. when using a remote server).
- object output_classes - (Optional.) A nested structure of Python `type` objects corresponding to each component of an element of this iterator. If omitted, each component is assumed to be of type tf.Tensor.
Returns
- Iterator - An `Iterator`.
Show Example
iterator = Iterator.from_structure(tf.int64, tf.TensorShape([]))

dataset_range = Dataset.range(10)
range_initializer = iterator.make_initializer(dataset_range)

dataset_evens = dataset_range.filter(lambda x: x % 2 == 0)
evens_initializer = iterator.make_initializer(dataset_evens)

# Define a model based on the iterator; in this example, the model_fn
# is expected to take scalar tf.int64 Tensors as input (see
# the definition of 'iterator' above).
prediction, loss = model_fn(iterator.get_next())

# Train for `num_epochs`, where for each epoch, we first iterate over
# dataset_range, and then iterate over dataset_evens.
for _ in range(num_epochs):
  # Initialize the iterator to `dataset_range`
  sess.run(range_initializer)
  while True:
    try:
      pred, loss_val = sess.run([prediction, loss])
    except tf.errors.OutOfRangeError:
      break

  # Initialize the iterator to `dataset_evens`
  sess.run(evens_initializer)
  while True:
    try:
      pred, loss_val = sess.run([prediction, loss])
    except tf.errors.OutOfRangeError:
      break
object from_structure_dyn(object output_types, object output_shapes, object shared_name, object output_classes)
Creates a new, uninitialized `Iterator` with the given structure. This iterator-constructing method can be used to create an iterator that is reusable with many different datasets. The returned iterator is not bound to a particular dataset, and it has no `initializer`. To initialize the iterator, run the operation returned by `Iterator.make_initializer(dataset)`. The following is an example:
Parameters
- object output_types - A nested structure of tf.DType objects corresponding to each component of an element of this dataset.
- object output_shapes - (Optional.) A nested structure of tf.TensorShape objects corresponding to each component of an element of this dataset. If omitted, each component will have an unconstrained shape.
- object shared_name - (Optional.) If non-empty, this iterator will be shared under the given name across multiple sessions that share the same devices (e.g. when using a remote server).
- object output_classes - (Optional.) A nested structure of Python `type` objects corresponding to each component of an element of this iterator. If omitted, each component is assumed to be of type tf.Tensor.
Returns
- object - An `Iterator`.
Show Example
iterator = Iterator.from_structure(tf.int64, tf.TensorShape([]))

dataset_range = Dataset.range(10)
range_initializer = iterator.make_initializer(dataset_range)

dataset_evens = dataset_range.filter(lambda x: x % 2 == 0)
evens_initializer = iterator.make_initializer(dataset_evens)

# Define a model based on the iterator; in this example, the model_fn
# is expected to take scalar tf.int64 Tensors as input (see
# the definition of 'iterator' above).
prediction, loss = model_fn(iterator.get_next())

# Train for `num_epochs`, where for each epoch, we first iterate over
# dataset_range, and then iterate over dataset_evens.
for _ in range(num_epochs):
  # Initialize the iterator to `dataset_range`
  sess.run(range_initializer)
  while True:
    try:
      pred, loss_val = sess.run([prediction, loss])
    except tf.errors.OutOfRangeError:
      break

  # Initialize the iterator to `dataset_evens`
  sess.run(evens_initializer)
  while True:
    try:
      pred, loss_val = sess.run([prediction, loss])
    except tf.errors.OutOfRangeError:
      break
Public properties
object element_spec get;
The type specification of an element of this iterator.
object element_spec_dyn get;
The type specification of an element of this iterator.
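An illustrative sketch in the equivalent Python API (the property's exact return value depends on the TensorFlow version being wrapped):
# Sketch: inspect the type specification of the iterator's elements.
iterator = tf.compat.v1.data.make_one_shot_iterator(tf.data.Dataset.range(3))
print(iterator.element_spec)  # e.g. TensorSpec(shape=(), dtype=tf.int64, name=None)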
object initializer get;
A tf.Operation that should be run to initialize this iterator.
object initializer_dyn get;
A tf.Operation that should be run to initialize this iterator.
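A minimal sketch in the TF 1.x Python API, assuming an open `sess`: the initializer op must be run before the first get_next() call.
dataset = tf.data.Dataset.range(3)
iterator = tf.compat.v1.data.make_initializable_iterator(dataset)
sess.run(iterator.initializer)        # run the initializer op first
print(sess.run(iterator.get_next()))  # -> 0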
object output_classes get;
Returns the class of each component of an element of this iterator. (deprecated) Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: Use `tf.compat.v1.data.get_output_classes(iterator)`. The expected values are tf.Tensor and tf.SparseTensor.
object output_classes_dyn get;
Returns the class of each component of an element of this iterator. (deprecated) Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: Use `tf.compat.v1.data.get_output_classes(iterator)`. The expected values are tf.Tensor and tf.SparseTensor.
object output_shapes get;
Returns the shape of each component of an element of this iterator. (deprecated) Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: Use `tf.compat.v1.data.get_output_shapes(iterator)`.
object output_shapes_dyn get;
Returns the shape of each component of an element of this iterator. (deprecated) Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: Use `tf.compat.v1.data.get_output_shapes(iterator)`.
object output_types get;
Returns the type of each component of an element of this iterator. (deprecated) Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: Use `tf.compat.v1.data.get_output_types(iterator)`.
object output_types_dyn get;
Returns the type of each component of an element of this iterator. (deprecated) Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: Use `tf.compat.v1.data.get_output_types(iterator)`.
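The sketch below shows the replacement functions recommended above, in the TF 1.x Python API (illustrative only; the dataset and iterator names are arbitrary):
# Sketch: preferred replacements for the deprecated output_* properties.
dataset = tf.data.Dataset.range(3)
iterator = tf.compat.v1.data.make_initializable_iterator(dataset)
classes = tf.compat.v1.data.get_output_classes(iterator)  # e.g. tf.Tensor
shapes = tf.compat.v1.data.get_output_shapes(iterator)    # e.g. TensorShape([])
types = tf.compat.v1.data.get_output_types(iterator)      # e.g. tf.int64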