LostTech.TensorFlow : API Documentation

Type CoupledInputForgetGateLSTMCell

Namespace tensorflow.contrib.rnn

Parent RNNCell

Interfaces ICoupledInputForgetGateLSTMCell

Long short-term memory unit (LSTM) recurrent network cell.

The default non-peephole implementation is based on:

https://pdfs.semanticscholar.org/1154/0131eae85b2e11d53df7f1360eeb6476e7f4.pdf

Felix Gers, Jürgen Schmidhuber, and Fred Cummins. "Learning to forget: Continual prediction with LSTM." IET, 850-855, 1999.

The peephole implementation is based on:

https://research.google.com/pubs/archive/43905.pdf

Haşim Sak, Andrew Senior, and Françoise Beaufays. "Long short-term memory recurrent neural network architectures for large scale acoustic modeling." INTERSPEECH, 2014.

The coupling of input and forget gate is based on:

http://arxiv.org/pdf/1503.04069.pdf

Greff et al. "LSTM: A Search Space Odyssey"
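
In this variant the input gate is not learned separately: it is tied to the forget gate, so a single gate decides both how much of the old cell state is kept and how much new input is written. A rough sketch of the cell-state update under that coupling (notation assumed, following Greff et al.; peepholes, projection, and layer normalization omitted):

f_t = \sigma(W_f [x_t, h_{t-1}] + b_f)
c_t = f_t \odot c_{t-1} + (1 - f_t) \odot \tanh(W_c [x_t, h_{t-1}] + b_c)
o_t = \sigma(W_o [x_t, h_{t-1}] + b_o)
h_t = o_t \odot \tanh(c_t)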

The class uses optional peephole connections and an optional projection layer. The layer normalization implementation is based on:

https://arxiv.org/abs/1607.06450

"Layer Normalization" Jimmy Lei Ba, Jamie Ryan Kiros, Geoffrey E. Hinton

and is applied before the internal nonlinearities.
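
Concretely, with layer normalization enabled each gate's pre-activation is normalized across its units before the sigmoid/tanh is applied; a rough sketch (notation assumed), where the gain and shift correspond to the norm_gain and norm_shift parameters documented below:

LN(x) = norm_gain \cdot \frac{x - \mu}{\sqrt{\sigma^2 + \epsilon}} + norm_shift

with \mu and \sigma^2 the mean and variance of x taken over the layer's units.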

Public static methods

CoupledInputForgetGateLSTMCell NewDyn(object num_units, ImplicitContainer<T> use_peepholes, object initializer, object num_proj, object proj_clip, ImplicitContainer<T> num_unit_shards, ImplicitContainer<T> num_proj_shards, ImplicitContainer<T> forget_bias, ImplicitContainer<T> state_is_tuple, ImplicitContainer<T> activation, object reuse, ImplicitContainer<T> layer_norm, ImplicitContainer<T> norm_gain, ImplicitContainer<T> norm_shift)

Initialize the parameters for an LSTM cell.
Parameters
object num_units
int, the number of units in the LSTM cell.
ImplicitContainer<T> use_peepholes
bool, set True to enable diagonal/peephole connections.
object initializer
(optional) The initializer to use for the weight and projection matrices.
object num_proj
(optional) int, The output dimensionality for the projection matrices. If None, no projection is performed.
object proj_clip
(optional) A float value. If `num_proj > 0` and `proj_clip` is provided, then the projected values are clipped elementwise to within `[-proj_clip, proj_clip]`.
ImplicitContainer<T> num_unit_shards
How to split the weight matrix. If >1, the weight matrix is stored across num_unit_shards.
ImplicitContainer<T> num_proj_shards
How to split the projection matrix. If >1, the projection matrix is stored across num_proj_shards.
ImplicitContainer<T> forget_bias
Biases of the forget gate are initialized by default to 1 in order to reduce the scale of forgetting at the beginning of the training.
ImplicitContainer<T> state_is_tuple
If True, accepted and returned states are 2-tuples of the `c_state` and `m_state`. By default (False), they are concatenated along the column axis. This default behavior will soon be deprecated.
ImplicitContainer<T> activation
Activation function of the inner states.
object reuse
(optional) Python boolean describing whether to reuse variables in an existing scope. If not `True`, and the existing scope already has the given variables, an error is raised.
ImplicitContainer<T> layer_norm
If `True`, layer normalization will be applied.
ImplicitContainer<T> norm_gain
float, The layer normalization gain initial value. If `layer_norm` has been set to `False`, this argument will be ignored.
ImplicitContainer<T> norm_shift
float, The layer normalization shift initial value. If `layer_norm` has been set to `False`, this argument will be ignored.
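
A minimal construction sketch in C#, based only on the NewDyn signature above. It assumes the ImplicitContainer<T> parameters accept plain values through implicit conversions and that omitted arguments fall back to the defaults described above; the argument values themselves are purely illustrative.

using tensorflow.contrib.rnn;

// Illustrative only: a 128-unit cell with peephole connections,
// a 64-dimensional projection clipped to [-5, 5], and layer normalization.
var cell = CoupledInputForgetGateLSTMCell.NewDyn(
    num_units: 128,
    use_peepholes: true,
    num_proj: 64,
    proj_clip: 5.0,
    forget_bias: 1.0,
    state_is_tuple: true,
    layer_norm: true,
    norm_gain: 1.0,
    norm_shift: 0.0);

The resulting cell behaves like any other RNNCell and can be driven by the usual RNN helpers.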

Public properties

PythonFunctionContainer activity_regularizer get; set;

object activity_regularizer_dyn get; set;

bool built get; set;

object dtype get;

object dtype_dyn get;

bool dynamic get;

object dynamic_dyn get;

object graph get;

object graph_dyn get;

IList<Node> inbound_nodes get;

object inbound_nodes_dyn get;

IList<object> input get;

object input_dyn get;

object input_mask get;

object input_mask_dyn get;

IList<object> input_shape get;

object input_shape_dyn get;

object input_spec get; set;

object input_spec_dyn get; set;

IList<object> losses get;

object losses_dyn get;

IList<object> metrics get;

object metrics_dyn get;

object name get;

object name_dyn get;

object name_scope get;

object name_scope_dyn get;

IList<object> non_trainable_variables get;

object non_trainable_variables_dyn get;

IList<object> non_trainable_weights get;

object non_trainable_weights_dyn get;

IList<object> outbound_nodes get;

object outbound_nodes_dyn get;

IList<object> output get;

object output_dyn get;

object output_mask get;

object output_mask_dyn get;

object output_shape get;

object output_shape_dyn get;

object output_size get;

Integer or TensorShape: size of outputs produced by this cell.

object output_size_dyn get;

Integer or TensorShape: size of outputs produced by this cell.

object PythonObject get;

object rnncell_scope get; set;

string scope_name get;

object scope_name_dyn get;

object state_size get;

Size(s) of state(s) used by this cell.

It can be represented by an Integer, a TensorShape or a tuple of Integers or TensorShapes.

object state_size_dyn get;

Size(s) of state(s) used by this cell.

It can be represented by an Integer, a TensorShape or a tuple of Integers or TensorShapes.

bool stateful get; set;

ValueTuple<object> submodules get;

object submodules_dyn get;

bool supports_masking get; set;

bool trainable get; set;

object trainable_dyn get; set;

object trainable_variables get;

object trainable_variables_dyn get;

IList<object> trainable_weights get;

object trainable_weights_dyn get;

IList<object> updates get;

object updates_dyn get;

object variables get;

object variables_dyn get;

IList<object> weights get;

object weights_dyn get;