Type BatchNormalization
Namespace tensorflow.keras.layers
Parent BatchNormalizationBase
Interfaces IBatchNormalization
Batch normalization layer (Ioffe and Szegedy, 2015). Normalizes the activations of the previous layer at each batch,
i.e. applies a transformation that maintains the mean activation
close to 0 and the activation standard deviation close to 1.
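The core transformation can be sketched in plain Python: center and scale a batch of activations to mean 0 and standard deviation 1 (with `epsilon` added for numerical stability), then apply the learned scale `gamma` and offset `beta`. This is a minimal illustrative sketch, not the layer's actual implementation; the `epsilon` default of 1e-3 assumes the usual Keras default.

```python
import math

def batch_norm_forward(x, gamma=1.0, beta=0.0, epsilon=1e-3):
    # Normalize a 1-D batch of activations to mean 0, std 1,
    # then apply the learned scale (gamma) and offset (beta).
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    inv_std = 1.0 / math.sqrt(var + epsilon)
    return [gamma * (v - mean) * inv_std + beta for v in x]

out = batch_norm_forward([1.0, 2.0, 3.0, 4.0])
```

With `gamma=1` and `beta=0` the output batch has mean 0 and standard deviation just under 1 (slightly shrunk by `epsilon`); setting `center=False` or `scale=False` on the layer corresponds to freezing `beta` or `gamma` here.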
Properties
- activity_regularizer
- activity_regularizer_dyn
- adjustment
- axis
- beta
- beta_constraint
- beta_initializer
- beta_regularizer
- built
- center
- dtype
- dtype_dyn
- dynamic
- dynamic_dyn
- epsilon
- fused
- gamma
- gamma_constraint
- gamma_initializer
- gamma_regularizer
- inbound_nodes
- inbound_nodes_dyn
- input
- input_dyn
- input_mask
- input_mask_dyn
- input_shape
- input_shape_dyn
- input_spec
- input_spec_dyn
- losses
- losses_dyn
- metrics
- metrics_dyn
- momentum
- moving_mean
- moving_mean_initializer
- moving_stddev
- moving_variance
- moving_variance_initializer
- name
- name_dyn
- name_scope
- name_scope_dyn
- non_trainable_variables
- non_trainable_variables_dyn
- non_trainable_weights
- non_trainable_weights_dyn
- outbound_nodes
- outbound_nodes_dyn
- output
- output_dyn
- output_mask
- output_mask_dyn
- output_shape
- output_shape_dyn
- PythonObject
- renorm
- renorm_clipping
- renorm_mean
- renorm_momentum
- renorm_stddev
- scale
- stateful
- submodules
- submodules_dyn
- supports_masking
- trainable
- trainable_dyn
- trainable_variables
- trainable_variables_dyn
- trainable_weights
- trainable_weights_dyn
- updates
- updates_dyn
- variables
- variables_dyn
- virtual_batch_size
- weights
- weights_dyn
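The `momentum`, `moving_mean`, and `moving_variance` properties above govern the statistics used at inference time: during training, the layer updates exponential moving averages of the batch mean and variance, and at inference it normalizes with those averages instead of per-batch statistics. A minimal sketch of that update rule, assuming the conventional Keras form `moving = moving * momentum + batch * (1 - momentum)`:

```python
def update_moving_stats(moving_mean, moving_var,
                        batch_mean, batch_var, momentum=0.99):
    # Exponential moving average of batch statistics; the larger the
    # momentum, the more slowly the moving estimates track each batch.
    new_mean = moving_mean * momentum + batch_mean * (1 - momentum)
    new_var = moving_var * momentum + batch_var * (1 - momentum)
    return new_mean, new_var

# e.g. with momentum=0.9, a batch mean of 10.0 nudges a moving mean
# of 0.0 up to 1.0:
m, v = update_moving_stats(0.0, 1.0, 10.0, 4.0, momentum=0.9)
```

The `moving_mean_initializer` and `moving_variance_initializer` properties set the starting values of these averages (typically zeros and ones, respectively).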