Normalization is a data preprocessing technique used to scale numeric features in a dataset to a standard range. This ensures that all input features have similar scales, which can help algorithms converge faster during training and prevent certain features from dominating others solely because of their larger magnitude.
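As an illustration, here is a minimal sketch of one common form of normalization, min-max scaling, which maps a feature onto the [0, 1] range. The function name and the NumPy-based implementation are illustrative choices, not a prescribed API:

```python
import numpy as np

def min_max_normalize(x):
    """Illustrative min-max normalization: scale values in x to [0, 1]."""
    x = np.asarray(x, dtype=float)
    x_min, x_max = x.min(), x.max()
    # Guard against a constant feature, whose range would be zero.
    if x_max == x_min:
        return np.zeros_like(x)
    return (x - x_min) / (x_max - x_min)

# A feature measured on a large scale is mapped onto [0, 1],
# so it can no longer dominate smaller-scale features by magnitude alone.
scaled = min_max_normalize([10, 20, 30, 40])
```

After scaling, the smallest value becomes 0, the largest becomes 1, and the rest fall proportionally in between.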
Some common types of normalization include: