What is Batch Normalization?
Batch Normalization is a technique for improving the training stability and performance of deep neural networks. For each feature, it normalizes a layer's activations by subtracting the mini-batch mean and dividing by the mini-batch standard deviation, then applies a learnable scale (gamma) and shift (beta) so the network retains its representational capacity. The technique was originally motivated by reducing internal covariate shift. In practice, Batch Normalization can accelerate training, improve generalization, and reduce sensitivity to hyperparameters such as the learning rate and weight initialization.
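
To make the normalization step concrete, here is a minimal NumPy sketch of the training-time forward pass. The function name, epsilon value, and toy shapes are illustrative assumptions, not a reference implementation; real frameworks (for example, torch.nn.BatchNorm1d) additionally track running statistics of the mean and variance for use at inference time.

    import numpy as np

    def batch_norm(x, gamma, beta, eps=1e-5):
        # x has shape (batch_size, num_features); statistics are per feature
        mean = x.mean(axis=0)                     # mini-batch mean
        var = x.var(axis=0)                       # mini-batch variance
        x_hat = (x - mean) / np.sqrt(var + eps)   # normalize; eps avoids division by zero
        return gamma * x_hat + beta               # learnable scale and shift

    # Example: a batch of 4 samples with 3 features, far from zero mean / unit variance
    x = np.random.randn(4, 3) * 10 + 5
    gamma = np.ones(3)    # scale parameter, learned during training
    beta = np.zeros(3)    # shift parameter, learned during training
    print(batch_norm(x, gamma, beta))

With gamma initialized to ones and beta to zeros, the output starts out with zero mean and unit variance per feature; during training the network is free to learn other values, including ones that undo the normalization if that helps.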