Type PeepholeLSTMCell
Namespace tensorflow.keras.experimental
Parent LSTMCell
Interfaces IPeepholeLSTMCell
Equivalent to the LSTMCell class, but with peephole connections added. Peephole connections allow the gates to use the previous internal (cell) state in addition to the previous hidden state, which is all that LSTMCell can see.
This lets PeepholeLSTMCell learn precise timings better than LSTMCell. From [Gers et al.](http://www.jmlr.org/papers/volume3/gers02a/gers02a.pdf): "We find that LSTM augmented by 'peephole connections' from its internal cells to its multiplicative gates can learn the fine distinction between sequences of spikes spaced either 50 or 49 time steps apart without the help of any short training exemplars."
The peephole implementation is based on [Long short-term memory recurrent neural network architectures for large scale acoustic modeling](https://research.google.com/pubs/archive/43905.pdf).
Example:
```python
from tensorflow import keras
from tensorflow.keras.experimental import PeepholeLSTMCell
from tensorflow.keras.layers import RNN

# Create 2 PeepholeLSTMCells.
peephole_lstm_cells = [PeepholeLSTMCell(size) for size in [128, 256]]
# Create a layer composed sequentially of the peephole LSTM cells.
layer = RNN(peephole_lstm_cells)
input = keras.Input((timesteps, input_dim))
output = layer(input)
```
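To make the peephole mechanism concrete, here is a minimal NumPy sketch of a single peephole LSTM step. This is an illustration of the equations from the Gers et al. formulation, not the class's actual implementation; the function name, argument layout, and the `(i, f, c, o)` gate ordering are assumptions chosen for clarity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def peephole_lstm_step(x, h_prev, c_prev, W, U, p, b):
    """One peephole LSTM step (illustrative sketch, not the Keras code).

    W: input kernel (input_dim, 4 * units), U: recurrent kernel
    (units, 4 * units), b: bias (4 * units,), all for the (i, f, c, o)
    gates; p holds three diagonal peephole weight vectors (p_i, p_f, p_o)
    that let the gates see the cell state directly.
    """
    z = x @ W + h_prev @ U + b            # (batch, 4 * units)
    zi, zf, zc, zo = np.split(z, 4, axis=-1)
    # Input and forget gates peek at the *previous* cell state.
    i = sigmoid(zi + p[0] * c_prev)
    f = sigmoid(zf + p[1] * c_prev)
    c = f * c_prev + i * np.tanh(zc)
    # The output gate peeks at the *updated* cell state.
    o = sigmoid(zo + p[2] * c)
    h = o * np.tanh(c)
    return h, c
```

In a plain LSTMCell the `p[k] * c` terms are absent, so the gates depend only on `x` and `h_prev`; the peephole terms are exactly what gives the gates direct access to the internal state.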
Methods
- build
- call
- get_dropout_mask_for_cell
- get_dropout_mask_for_cell_dyn
- get_recurrent_dropout_mask_for_cell
- get_recurrent_dropout_mask_for_cell_dyn
- NewDyn
- reset_dropout_mask
- reset_dropout_mask_dyn
- reset_recurrent_dropout_mask
- reset_recurrent_dropout_mask_dyn
Properties
- activation
- activity_regularizer
- activity_regularizer_dyn
- bias
- bias_constraint
- bias_initializer
- bias_regularizer
- built
- dropout
- dtype
- dtype_dyn
- dynamic
- dynamic_dyn
- forget_gate_peephole_weights
- implementation
- inbound_nodes
- inbound_nodes_dyn
- input
- input_dyn
- input_gate_peephole_weights
- input_mask
- input_mask_dyn
- input_shape
- input_shape_dyn
- input_spec
- input_spec_dyn
- kernel
- kernel_constraint
- kernel_initializer
- kernel_regularizer
- losses
- losses_dyn
- metrics
- metrics_dyn
- name
- name_dyn
- name_scope
- name_scope_dyn
- non_trainable_variables
- non_trainable_variables_dyn
- non_trainable_weights
- non_trainable_weights_dyn
- outbound_nodes
- outbound_nodes_dyn
- output
- output_dyn
- output_gate_peephole_weights
- output_mask
- output_mask_dyn
- output_shape
- output_shape_dyn
- output_size
- PythonObject
- recurrent_activation
- recurrent_constraint
- recurrent_dropout
- recurrent_initializer
- recurrent_kernel
- recurrent_regularizer
- state_size
- stateful
- submodules
- submodules_dyn
- supports_masking
- trainable
- trainable_dyn
- trainable_variables
- trainable_variables_dyn
- trainable_weights
- trainable_weights_dyn
- unit_forget_bias
- units
- updates
- updates_dyn
- use_bias
- variables
- variables_dyn
- weights
- weights_dyn