Regularization
Regularization refers to a set of techniques designed to prevent overfitting and improve generalization. A prominent example is adding a penalty term to the loss function that discourages large learned weights (e.g., L1 or L2 regularization).
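As a sketch of the idea (the symbols $\mathcal{L}_{\text{data}}$, $w$, and $\lambda$ are illustrative notation, not taken from this text), the regularized objective augments the data loss with a penalty on the weight vector:

$$\mathcal{L}_{\text{total}}(w) = \mathcal{L}_{\text{data}}(w) + \lambda \lVert w \rVert_2^2 \quad \text{(L2)} \qquad \text{or} \qquad \mathcal{L}_{\text{total}}(w) = \mathcal{L}_{\text{data}}(w) + \lambda \lVert w \rVert_1 \quad \text{(L1)}$$

Here the hyperparameter $\lambda$ controls the strength of the penalty: larger values push the optimizer toward smaller weights. The L1 penalty additionally tends to drive many weights exactly to zero, yielding sparse models, whereas the L2 penalty (also known as weight decay) shrinks all weights smoothly.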