Type: ReLU
Namespace: tensorflow.keras.layers
Parent: Layer
Interfaces: IReLU
Rectified Linear Unit activation function. With default values, it returns element-wise `max(x, 0)`. Otherwise, it follows:
`f(x) = max_value` for `x >= max_value`,
`f(x) = x` for `threshold <= x < max_value`,
`f(x) = negative_slope * (x - threshold)` otherwise.
Input shape:
Arbitrary. Use the keyword argument `input_shape` (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.
Output shape:
Same shape as the input.
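Since this type binds the Keras `ReLU` layer, the piecewise rule above can be checked numerically with the Python API. A minimal sketch, assuming TensorFlow 2.x is installed (the Python layer takes the same `max_value`, `negative_slope`, and `threshold` arguments):

```python
import numpy as np
import tensorflow as tf

# ReLU with all three knobs set: values are clipped at max_value,
# passed through between threshold and max_value, and scaled by
# negative_slope below threshold.
relu = tf.keras.layers.ReLU(max_value=6.0, negative_slope=0.1, threshold=1.0)

x = np.array([-3.0, 0.0, 0.5, 1.0, 4.0, 10.0], dtype=np.float32)
y = relu(x)

# f(-3.0) = 0.1 * (-3.0 - 1.0) = -0.4   (x < threshold)
# f(0.5)  = 0.1 * (0.5 - 1.0)  = -0.05  (x < threshold)
# f(4.0)  = 4.0                         (threshold <= x < max_value)
# f(10.0) = 6.0                         (x >= max_value, clipped)
print(y.numpy())  # [-0.4  -0.1  -0.05  1.    4.    6.  ]
```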
Methods
Properties
- activity_regularizer
- activity_regularizer_dyn
- built
- dtype
- dtype_dyn
- dynamic
- dynamic_dyn
- inbound_nodes
- inbound_nodes_dyn
- input
- input_dyn
- input_mask
- input_mask_dyn
- input_shape
- input_shape_dyn
- input_spec
- input_spec_dyn
- losses
- losses_dyn
- max_value
- metrics
- metrics_dyn
- name
- name_dyn
- name_scope
- name_scope_dyn
- negative_slope
- non_trainable_variables
- non_trainable_variables_dyn
- non_trainable_weights
- non_trainable_weights_dyn
- outbound_nodes
- outbound_nodes_dyn
- output
- output_dyn
- output_mask
- output_mask_dyn
- output_shape
- output_shape_dyn
- PythonObject
- stateful
- submodules
- submodules_dyn
- support_masking
- supports_masking
- threshold
- trainable
- trainable_dyn
- trainable_variables
- trainable_variables_dyn
- trainable_weights
- trainable_weights_dyn
- updates
- updates_dyn
- variables
- variables_dyn
- weights
- weights_dyn
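Several of the properties above (`max_value`, `negative_slope`, `threshold`) simply expose the layer's constructor arguments; the rest are inherited from `Layer`. A short Python sketch of the underlying Keras layer, assuming standard TF 2.x behavior (the printed `name` is a typical auto-generated value, not guaranteed):

```python
import tensorflow as tf

relu = tf.keras.layers.ReLU(max_value=6.0)

# Constructor arguments surface as attributes on the layer.
print(relu.max_value)       # 6.0
print(relu.negative_slope)  # 0.0 (default)
print(relu.threshold)       # 0.0 (default)

# Inherited Layer properties; ReLU itself has no trainable weights.
print(relu.name)            # e.g. "re_lu"
print(relu.trainable)       # True
print(relu.weights)         # []
```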